
meep's People

Contributors

acerjan, ahoenselaar, alteholz, bencbartlett, brettle, christopherhogan, dependabot[bot], droundy, hammy4815, homerreid, ianwilliamson, jamesetouma, jianguan210, joamatab, kkg4theweb, maruoka842, mawc2019, mochen4, oskooi, pbermel, robind42, scimax, seewhydee, smartalech, soamaven, stevengj, thchr, theogdoctorg, thomasdorch, yaraslaut


meep's Issues

model information

When I want to use this program, how can I input the model information?
Does this program include grid generation, or do we need to write our own procedures to achieve this?
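For context, a model in Meep is specified as geometric objects plus a resolution; Meep discretizes everything onto its own Yee grid, so no external mesh generation is required. A minimal sketch using the Python interface (the same interface as the test script quoted further down this page; the values here are placeholders):

import meep as mp

# The model is a list of geometric objects; Meep discretizes them onto its
# own Yee grid at the requested resolution, so no mesh file is imported.
cell = mp.Vector3(16, 8, 0)
geometry = [mp.Block(mp.Vector3(1e20, 1, 1e20),   # effectively infinite waveguide
                     center=mp.Vector3(0, 0, 0),
                     material=mp.Medium(epsilon=12))]

sim = mp.Simulation(cell_size=cell,
                    geometry=geometry,
                    resolution=10)
sim.run(until=200)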

meep fails to run at command line, no arguments

Installed from source on CentOS 7.3.1611 with guile 2.0.9. MPB works just fine. Any thoughts?

Backtrace:
In ice-9/boot-9.scm:
 157: 10 [catch #t #<catch-closure 14beb20> ...]
In unknown file:
   ?: 9 [apply-smob/1 #<catch-closure 14beb20>]
In ice-9/eval.scm:
 432: 8 [eval # #]
 432: 7 [eval # #]
In unknown file:
   ?: 6 [primitive-load "/usr/local/share/meep/meep.scm"]
In ice-9/eval.scm:
 432: 5 [eval # #]
 432: 4 [eval # #]
In unknown file:
   ?: 3 [primitive-load "/usr/local/share/meep/meep-enums.scm"]
In ice-9/eval.scm:
 453: 2 [eval #<memoized (define Centered Dielectric)> ()]
 393: 1 [eval #<memoized Dielectric> ()]
In unknown file:
   ?: 0 [memoize-variable-access! #<memoized Dielectric> #<directory # 14aec60>]

ERROR: In procedure memoize-variable-access!:
ERROR: Unbound variable: Dielectric

Missing configure file in meep 1.3

Hi guys,
I am trying to install meep on a windows machine (Windows 10, x64) using the instructions from https://novelresearch.weebly.com/installing-meep-in-windows-8-via-cygwin.html
Everything works fine until I want to install meep itself. The first thing is that the configure file is missing in the meep folder. During the installation of libctl I ran into the same issue - however, there the command sh autogen.sh created the configure file in the libctl folder.
Therefore I also ran sh autogen.sh in the meep folder and indeed the configure file was created. Running the command ./configure --prefix=/usr/local also works.
But when running make I get the following error:

make[2]: *** No rule to make target 'sphere_quad'. Stop

I am aware of previous issues relating to the 'sphere_quad' file (e.g. when installing MPB) but unfortunately I didn't find any hints concerning the 'sphere_quad' file in this context.
Any suggestions are welcome!
Thanks and kind regards,
Lozenz

Feature Request: Parallel near2far calculation

The near2far calculation for 3D geometries takes a very long time (often longer than the simulation itself). It would be nice to get this part of the code parallelized. This would be especially useful for metasurface antenna type problems.

Another option might be to export the DFT'd near fields in a readable format so that we could write our own parallel near2far transformation in MATLAB/Python. Currently we need to save the full near field at every timestep to do the transform, which results in a lot of data and slows down the calculation.

Feature request: gyrotropic media

I would like to re-open the topic of gyrotropic media in MEEP. (Related terms are also Faraday effect, nonreciprocal behaviour, magnetooptics etc. Mathematically, all these are represented by nonzero imaginary part of the off-diagonal components in the permittivity tensor.)
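For concreteness, the permittivity tensor in question has the standard gyrotropic form (one common convention, with the gyration axis along z):

\varepsilon =
\begin{pmatrix}
  \varepsilon_1 & -\mathrm{i}\,g & 0 \\
  \mathrm{i}\,g & \varepsilon_1 & 0 \\
  0 & 0 & \varepsilon_3
\end{pmatrix},
\qquad g \in \mathbb{R},

where the imaginary off-diagonal entries \pm\mathrm{i}g are exactly the components responsible for the Faraday rotation and nonreciprocal behaviour mentioned above.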

While I found no access to this functionality in the current version, there are hints that it was implemented years ago.

This would be a very interesting feature for many researchers. (I know it from personal collaboration with other people, and it has also been asked for repeatedly in the mailing lists.)

I believe the transfer of MEEP development to GitHub can finally bring together the people with the know-how and the people with the time and motivation to make it work: could we merge the apparently existing nonreciprocal MEEP code, perhaps as a mere prototype or experimental feature?

Bug: Guile 2.0.13 not compatible with meep 1.3

I just installed meep and related prerequisites via Filip Dominic's configure script, python-meep-install. The first tutorial (1D waveguide), which I ran as tutorial1.ctl, reports errors such as:

Using MPI version 3.1, 1 processes

Initializing structure...
Working in 2D dimensions.
Computational cell is 16 x 8 x 0 with resolution 10
block, center = (0,0,0)
size (1e+20,1,1e+20)
axes (1,0,0), (0,1,0), (0,0,1)
dielectric constant epsilon diagonal = (12,12,12)
time for set_epsilon = 0.163085 s

creating output file "./tutorial1-eps-000000.00.h5"...
Backtrace:
In ice-9/boot-9.scm:
160: 12 [catch #t #<catch-closure 55a6ef700260> ...]
In unknown file:
?: 11 [apply-smob/1 #<catch-closure 55a6ef700260>]
In ice-9/eval.scm:
432: 10 [eval # #]
432: 9 [eval # #]
In unknown file:
?: 8 [primitive-load "tutorial1.ctl"]
In ice-9/eval.scm:
432: 7 [eval # #]
In ice-9/boot-9.scm:
703: 6 [map #<procedure 55a6efc04b40 at ice-9/eval.scm:416:20 (a)> #]
In ice-9/eval.scm:
399: 5 [eval # #]
387: 4 Exception thrown while printing backtrace:
ERROR: In procedure delete-meep-volume: Wrong type argument in position 1: #<finalized smob 55a6f00f4a00>

ERROR: In procedure %run-finalizers:
ERROR: In procedure delete-meep-volume: Wrong type argument in position 1: #<finalized smob 55a6f00f4de0>

Some deprecated features have been used. Set the environment
variable GUILE_WARN_DEPRECATED to "detailed" and rerun the
program to get more information. Set it to "no" to suppress
this message.

Filip Dominic found that everything ran without error on his local computer using Guile 2.0.11 but he reproduced the errors when he ran the same ctl file using Guile 2.0.13. He asked me to report this bug in 2.0.13.

fields instability when combining 2+ mirror symmetries with periodic boundary conditions in 3d

As has been already reported on the meep-discuss list, there is a bug which leads to field instabilities when combining 3 mirror-symmetry objects with periodic boundary conditions in 3d.

This is demonstrated by the following simple example of a point source in vacuum:

(set-param! resolution 10)
(set! geometry-lattice (make lattice (size 1 1 1)))

(set! sources (list (make source (src (make gaussian-src (frequency 1) (fwidth 1))) (component Ez) (center 0 0 0))))

(set! k-point (vector3 0 0 0))

(set! symmetries (list (make mirror-sym (direction X))
                       (make mirror-sym (direction Y))
                       (make mirror-sym (direction Z) (phase -1))))

(define print-field (lambda () (print "ez:, " (meep-time) ", " (magnitude (get-field-point Ez (vector3 0 0 0))) "\n")))

(run-until 500 (at-every 3.4 print-field))

For this script, the fields begin to blow up early on:

-----------
Initializing structure...
Working in 3D dimensions.
Computational cell is 1 x 1 x 1 with resolution 10
Halving computational cell along direction x
Halving computational cell along direction y
Halving computational cell along direction z
time for set_epsilon = 0.000983953 s
-----------
ez:, 3.4000000000000004, 0.11299371272955527
ez:, 6.800000000000001, 1.3292101237116931
ez:, 10.200000000000001, 2.402271717485686
ez:, 13.600000000000001, 4.484463313924006
ez:, 17.0, 3.8996648343918103
ez:, 20.400000000000002, 0.13634058216335865
ez:, 23.8, 4.629265409516416
ez:, 27.200000000000003, 4.998705102736996
ez:, 30.6, 1.2298673000561804
ez:, 34.0, 2.6981516161770243
ez:, 37.4, 4.377899104931889
ez:, 40.800000000000004, 3.5929413491696733
ez:, 44.2, 15.319503849169674
ez:, 47.6, 1194.3195038491704
ez:, 51.0, 2915498.319503849
ez:, 54.400000000000006, 559645525.6804962
ez:, 57.800000000000004, 4232176173909.6807

There are no instabilities if the Z mirror-symmetry object is removed:

-----------
Initializing structure...
Working in 3D dimensions.
Computational cell is 1 x 1 x 1 with resolution 10
Halving computational cell along direction x
Halving computational cell along direction y
time for set_epsilon = 0.00132418 s
-----------
ez:, 3.4000000000000004, 0.11299371272955483
ez:, 6.800000000000001, 1.3292101237116927
ez:, 10.200000000000001, 2.402271717485691
ez:, 13.600000000000001, 4.484463313924063
ez:, 17.0, 3.899664834392148
ez:, 20.400000000000002, 0.13634058216075873
ez:, 23.8, 4.629265409492891
ez:, 27.200000000000003, 4.998705102472007
ez:, 30.6, 1.2298672971468396
ez:, 34.0, 2.6981515888410907
ez:, 37.4, 4.3779221161355615
ez:, 40.800000000000004, 3.6004200025033697
ez:, 44.2, 0.1855690755252214
ez:, 47.6, 4.1906787491702815
ez:, 51.0, 5.302888125953832
ez:, 54.400000000000006, 1.6628091344931328
ez:, 57.800000000000004, 2.9025704016780374

LDOS example

Hi,

We run the LDOS example, http://ab-initio.mit.edu/wiki/index.php/Meep_Tutorial/Local_density_of_states

We still get different results for different numbers of MPI cores. amrit-poudel reported this in an earlier thread. I thought this issue was solved; I just want to confirm, or ask whether there is something we should pay attention to.

1 core:
ldos0:, 170.5442275590302 Q:, 285.4737253883177 f:, 0.6784055061639759
ldos1:, 154.20084087673828 f:, 0.6784055061639759,

5 cores:
ldos0:, 170.5442275590302 Q:, 285.4737253883177 f:, 0.6784055061639759
ldos1:, 154.20084087673828 f:, 0.6784055061639759,

10 cores:
ldos0:, 170.5442275590302 Q:, 285.4737253883177 f:, 0.6784055061639759
ldos1:, 77.10042043836914 f:,0.6784055061639759,

15 cores:
ldos0:, 170.5442275590302 Q:, 285.4737253883177 f:, 0.6784055061639759
ldos1:, 77.10042043836914 f:, 0.6784055061639759,

20 cores:
ldos0:, 170.5442275590302 Q:, 285.4737253883177 f:, 0.6784055061639759
ldos1:, 38.55021021918457 f:,0.6784055061639759

fields instability when combining dft forces with symmetry objects

There is a bug in the dft force feature when used with symmetry objects. This is demonstrated in the following simple example of 2d vacuum with PML boundaries involving a point source at the origin enclosed by a square box of force-regions. Given the symmetry of this arrangement, the total force obtained by summing up the four line regions should be 0. This is the result when no mirror-symmetry objects are used. However, the presence of any mirror-symmetry objects produces a non-zero value.

(set-param! resolution 10)

(set! geometry-lattice (make lattice (size 5 5 no-size)))

(set! pml-layers (list (make pml (thickness 1))))

(set! sources (list (make source (src (make gaussian-src (frequency 1.0) (fwidth 1.0)))
                          (center 0 0 0) (component Ez))))

(set! symmetries (list (make mirror-sym (direction X))
                       (make mirror-sym (direction Y))))

(define force-box (add-force 1 0 1
                             (make force-region (direction X) (center +1 0 0) (size 0 2 0) (weight +1.0))
                             (make force-region (direction X) (center -1 0 0) (size 0 2 0) (weight -1.0))
                             (make force-region (direction Y) (center 0 +1 0) (size 2 0 0) (weight +1.0))
                             (make force-region (direction Y) (center 0 -1 0) (size 2 0 0) (weight -1.0))))

(run-sources+ (stop-when-fields-decayed 50 Ez (vector3 0 0 0) 1e-6))

(display-forces force-box)

The above script produces the following incorrect non-zero output:

force1:, 1.0, 0.38368976835348184

When the symmetry objects are removed, the force has the correct value of 0:

force1:, 1.0, 1.7105999776390046e-16

Cannot make meep on macOS Sierra

I am using this "issue" space to record my attempts to get meep installed on macOS Sierra (10.12.5). It is probably best to scroll down to the bottom and read it in reverse chronological order to get an idea of the present state of the problem.

After downloading meep-master from GitHub, ./configure LDFLAGS=-L/usr/local/lib CPPFLAGS=-I/usr/local/include goes to completion and generates a make file. However, make terminates with errors:

meep_wrap.cxx:1394:10: error: use of undeclared identifier 'SCM_VECTORP'
return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;
^
meep_wrap.cxx:1394:28: error: use of undeclared identifier 'SCM_VECTOR_LENGTH'
return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;
^
meep_wrap.cxx:53286:63: error: address of overloaded function '_wrap_do_harminv' does not match required type 'void'
scm_c_define_gsubr("do-harminv", 0, 0, 1, (swig_guile_proc) _wrap_do_harminv);

I tried modifying the Makefile.am in /libctl subdirectory, as suggested on the DarkAlex blog, https://darkalexwang.github.io/2016/10/06/python-meep-install-mac/, but either I don't know how to apply the edits properly or the edits are no longer effective. In any case the same errors appear after executing the make command.

Does anybody know if there is a simple edit somewhere for the "undeclared identifier" problem and/or the "address of overloaded function" problem?


Update to this continuing issue: 6 July 2017:


I tried oskooi's suggestion below, following the recipe that one finds here, https://www.mail-archive.com/[email protected]/msg05719.html. I followed the recipe by using "home-brew uninstall guile" to get rid of guile 2.2.2. Then I downloaded guile 2.0.11 from the gnu index site, https://ftp.gnu.org/gnu/guile/ and tried to execute the configure, make, make install sequence. I was able to get the ./configure script to run to the end without error by using ./configure --prefix=/usr/local --enable-deprecated=YES LDFLAGS='-L/usr/local/opt/libffi/lib -L/usr/local/opt/readline/lib' CPPFLAGS='-I/usr/local/Cellar/libffi/3.2.1/lib/libffi-3.2.1/include -I/usr/local/opt/readline/include'
The long path CPPFLAGS was necessary to find ffi.h because home-brew did not symlink libffi to /usr/local. I thought setting "--enable-deprecated=YES" might be helpful for using deprecated stuff in guile that still works. However make still has problems:

"Undefined symbols for architecture x86_64:
"_clock_getcpuclockid", referenced from:
_scm_init_stime in libguile_2.0_la-stime.o
"_ffi_call", referenced from:
_scm_i_foreign_call in libguile_2.0_la-foreign.o
"_ffi_closure_alloc", referenced from:
_scm_procedure_to_pointer in libguile_2.0_la-foreign.o"

etc, etc (there are lots more "Undefined symbols for architecture x86_64"). The list of undefined symbols terminates with

"ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)"

So I cannot proceed with the meep installation recipe until I get guile 2.0.11 made and installed. Does anybody know if there is a workaround for the "undefined symbols" problem?


Update to this continuing issue: 7 July 2017


I tried a different approach, downloading guile 2.0.11 from the GitHub repository and executing the autogen.sh script. It produced an error message,
"configure.ac:927: error: possibly undefined macro: AM_GNU_GETTEXT
If this token and others are legitimate, please use m4_pattern_allow."
I edited the configure.ac file by commenting out line 927 and using 'm4_pattern_allow' for AM_GNU_GETTEXT. This tactic allowed the generation of the configure file, but from there the story is the same as yesterday. The linking stops here:

"CCLD libguile-2.0.la
Undefined symbols for architecture x86_64:
"_clock_getcpuclockid", referenced from:
_scm_init_stime in libguile_2.0_la-stime.o
...
...
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)"


Update to this continuing issue: 8 July 2017:


According to https://www.gnu.org/software/gettext/manual/html_node/AM_005fGNU_005fGETTEXT.html,
AM_GNU_GETTEXT is a macro in a file called gettext.m4. Maybe autogen.sh doesn't find AM_GNU_GETTEXT because it does not look in the directory where gettext.m4 resides. I installed gettext with home-brew, thinking that perhaps gettext was missing from macOS. A "find" on gettext.m4 locates it in /usr/local/Cellar/gettext/0.19.8.1/share/aclocal/gettext.m4. Is there some way to get autogen.sh or configure.ac to look in this path?


Update to this continuing issue: 9 July 2017:


I think I have solved the AM_GNU_GETTEXT missing macro problem. Here is what I do to get to a ./configure file for guile-2.0.11: First download the compressed tar package (I think these are called "tarballs") from the GitHub repository and tar -xvf to get the files in a guile-2.0.11 directory. One of the files is autogen.sh which is a bash script that calls configure.ac, an "autotool" that generates the appropriate configure executable. However, configure.ac stops with an error because it cannot find gettext.m4 in the m4 subdirectory. I installed gettext using home-brew, which puts other m4 (apparently more up-to-date) files in /usr/local/Cellar/gettext/0.19.8.1/share/aclocal. Then I just copied all the m4 files from /usr/local/Cellar... to .../guile-2.0.11/m4, then executed the autogen script again. Now it runs to completion and generates a configure executable. I then do

./configure LIBFFI_CFLAGS="-I/usr/local/opt/libffi/include -I/usr/local/Cellar/libffi/3.2.1/lib/libffi-3.2.1/include" LIBFFI_LIBS="-L/usr/local/opt/libffi/lib" LDFLAGS="-L/usr/local/opt/gettext/lib -L/usr/local/lib -L/usr/local/opt/readline/lib -L/usr/local/Cellar/gettext/0.19.8.1/lib" CPPFLAGS="-I/usr/local/opt/gettext/include -I/usr/local/include -I/usr/local/opt/readline/include -I/usr/local/Cellar/gettext/0.19.8.1/include".

The LIBFFI_CFLAGS and LIBFFI_LIBS are necessary because the ffi home-brew installation puts the ffi files in /usr/local/opt/libffi. Some of the LDFLAGS and CPPFLAGS may be superfluous, but I don't dare change them because with these ./configure options the executable runs to completion and generates a make file in .../guile-2.0.11. Now I "make", which after a while stops and generates the error message indicated above. Here it is again just for convenience:

Making all in libguile
/Applications/Xcode.app/Contents/Developer/usr/bin/make all-am
CCLD libguile-2.0.la
Undefined symbols for architecture x86_64:
"_clock_getcpuclockid", referenced from:
_scm_init_stime in libguile_2.0_la-stime.o
"_ffi_call", referenced from:
_scm_i_foreign_call in libguile_2.0_la-foreign.o
"_ffi_closure_alloc", referenced from:
_scm_procedure_to_pointer in libguile_2.0_la-foreign.o

and a few more "ffi" type of errors. The message terminates with

ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation).

Fishing around on Google, using "ffi" as bait, I found this: yallop/ocaml-ctypes#74 which describes similar symptoms. The problem apparently is that the macOS or Xcode comes with an out-of-date ffi library and the linker finds it in the search path before it looks in /usr/local/Cellar/libffi/3.2.1/lib, where the up-to-date ffi library is. Can someone help me modify the search path when the linker is invoked? Should I try to modify the make file in /Applications/Xcode.app/Contents/Developer/usr/bin/make ? Apparently one can also modify pkg-config, but I don't dare start messing around in these things without having a decent idea of what I am doing.

The first error message ("_clock_getcpuclockid", referenced from _scm_init_stime in libguile_2.0_la-stime.o) is a qualitatively different problem, I would guess.


Update to this continuing issue: 12 July 2017:


I added the following library path to the ./configure command,

-L/Users/johnweiner/Downloads/guile-2.0.11/lib

because the first missing symbol, "_clock_getcpuclockid", is associated with a header file, time.h. I found time.h in the above library path and ran ./configure with all the search path options again. Then "make V=1". The "V=1" option shows the compile and link commands and files explicitly as the make file executes. However, the problem remains unchanged.


Update to this continuing issue: 14 July 2017:


Finally figured out why the ffi symbols could not be found. The linker must be told where to find the relevant library and to link it, LIBFFI_LIBS="-L/usr/local/opt/libffi/lib -lffi". The "-lffi" flag had been missing all along. Including it in LIBFFI_LIBS eliminates all the errors associated with undefined ffi symbols. The one remaining is

Undefined symbols for architecture x86_64:
"_clock_getcpuclockid", referenced from:
_scm_init_stime in libguile_2.0_la-stime.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Apparently this failure is due to a call in libguile/stime.c to "clock_getcpuclockid". There is a note about it in bug#23870, http://lists.gnu.org/archive/html/bug-guile/2016-06/msg00252.html. There is a patch to fix the problem for macOS, https://gist.github.com/rahulg/baa500e84136f0965e9ade2fb36b90ba/revisions, but I'm not sure how to apply this patch.

Update to this continuing issue: 18 July 2017:

The patch does not involve a lot of text so I just typed it into the pertinent file, .../libguile/stime.c, and then the usual "configure, make" sequence. The error remained unfixed. Finally I just downloaded a later version of guile in which the problem with stime.c was supposedly fixed (http://lists.gnu.org/archive/html/bug-guile/2017-03/msg00000.html). Then I replaced the stime.c from Guile-2.0.11 with the stime.c from the later version. Then configure and make. This time make goes to completion with lots of warnings but no errors. After sudo make install Guile-2.0.11 was finally installed and seemed to be functioning ok...at least I was able to get a guile prompt and the command, guile --version gives the correct version.

Now with guile installed I proceeded to install meep-1.3 which I downloaded directly from the meep MIT Ab Initio site. I had some initial problems getting meep-1.3 to "make" because somewhere the linker was getting a flag to a guile path that no longer, if ever, existed. Possibly this residual path remained from an earlier attempt to install guile with home-brew...before learning that Guile-2.2 would not work with meep. Even though I uninstalled the home-brew installation the linker path persisted.

Now meep-1.3 installs! I created the first tutorial ctl file for the straight waveguide. The program ran and generated the two expected h5 files. Now, however, the h5utils command, h5topng, generates a segmentation fault, but h5utils problems should be the subject of a different "issue". For the time being at least I am assuming that the h5 files generated by the straight waveguide run are not defective, and the problem is with h5utils.

Any suggestions would be greatly appreciated.

John

Unable to build libctl

I'm attempting to build Meep 1.3 to run under Windows via Cygwin64, following the steps here.

Unfortunately I'm stuck attempting to build libctl. There's no configure file found, and the autogen.sh errors out with "autoreconf: command not found"

Any words of wisdom?

thanks,
Chris

Lorentzian check reports instability even when stable

I would like to open the broad question of FDTD numerical stability. By trial and error I isolated two independent criteria for how a material definition can make the simulation unstable (a third mechanism, unstable PML for grazing-incidence waves, was resolved in MEEP 1.2). In the treatise that follows, I tried to describe these two criteria.

In the last 3 years I have tested different materials thoroughly and these rules seem to hold so far (though I have no theory as to why).

What does not always hold, however, is the corresponding function `lorentzian_unstable' (in src/susceptibility.cpp). MEEP checks the position of the oscillator pole in the complex frequency plane, based on some mathematics that I admit I do not yet understand, and sometimes reports false positives. In cases near the edge of stability it reports instability and aborts.
I experimentally verified that after disabling the function, recompiling and running the same simulation again, no instability developed even over a very long time.

Of course, there are also false negatives, due to the second criterion: The simulation goes unstable whenever the permittivity at high frequencies is too low.

I suggest that, first of all, `lorentzian_unstable' should be changed to a warning only; it should be called only once after initialisation (not n times at every step); and, no matter what the outcome, the simulation should continue. In the future, we should thoroughly test the hypotheses attached below. This starts to be a bit of experimental research instead of programming, but with so many people computing plasmonics etc., it is important to make this clear.

Looking forward to your comments!
Filip

Attachments: fdtd_stability1, fdtd_stability2

hdf5 chunk error for large simulations

When trying to save the output of a large simulation with more than 50000 grid points using parallel-MEEP 1.3, an H5D-error concerning h5-chunk sizes bigger than 4GB appears (see attached part of the output file). This was tested using both the parallel version of hdf5-1.08 and hdf5-1.10-alpha1, memory space is not an issue.

Is there a MEEP-integrated way of forcing smaller h5-chunks in the output files or a general workaround for this problem? Thanks for your help.

H5D-error.txt

MPI_Barrier() function called after MPI_FINALIZE was invoked.

I am running an mpi enabled version of meep with the Python interface. I have built and installed from source, and all the tests are passing.

The code

def test000(self):
    cell = mp.Vector3(16, 16, 0)
    geometry = [mp.Block(mp.Vector3(12, 1, 1e20),
                         center=mp.Vector3(-2.5, -3.5),
                         material=mp.Medium(epsilon=12)),
                mp.Block(mp.Vector3(1, 12, 1e20),
                         center=mp.Vector3(3.5, 2),
                         material=mp.Medium(epsilon=12))]
    pml_layers = [mp.PML(1.0)]
    resolution = 10

    sources = [mp.Source(mp.ContinuousSource(wavelength=2*(11 ** 0.5),
                                             width=20),
                         component=mp.Ez,
                         center=mp.Vector3(-7, -3.5),
                         size=mp.Vector3(0, 1))]

    sim = mp.Simulation(cell_size=cell,
                        boundary_layers=pml_layers,
                        geometry=geometry,
                        sources=sources,
                        resolution=resolution)

    sim.run(mp.at_beginning(mp.output_epsilon),
            mp.to_appended("ez", mp.at_every(0.6, mp.output_efield_z)),
            until=200)
    return

works without issue when using the non mpi enabled version of meep. However, with mpi I have the following error:

Compilation started at Mon Oct 16 18:52:53

nosetests -v test_basics\:TutorialBentWaveguide.test000

**
** successfully loaded python MPI module (mpi4py)
**
test000 (test_basics.TutorialBentWaveguide) ... time for set_epsilon = 0.428574 s
Working in 2D dimensions.
Computational cell is 16 x 16 x 0 with resolution 10
     block, center = (-2.5,-3.5,0)
          size (12,1,1e+20)
          axes (1,0,0), (0,1,0), (0,0,1)
          dielectric constant epsilon diagonal = (12,12,12)
     block, center = (3.5,2,0)
          size (1,12,1e+20)
          axes (1,0,0), (0,1,0), (0,0,1)
          dielectric constant epsilon diagonal = (12,12,12)
time for set_epsilon = 0.0348468 s
-----------
creating output file "./nosetests-eps-000000.00.h5"...
creating output file "./nosetests-ez.h5"...

Field time usage:
    connnecting chunks: 0.00792003 s
         time stepping: 0.491834 s
         communicating: 0.0852661 s
     outputting fields: 1.30064 s
    Fourier transforming: 0.00029254 s
       everything else: 0.0530965 s

ok

----------------------------------------------------------------------
Ran 1 test in 2.413s

OK
*** The MPI_Barrier() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):29473] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!

Compilation exited abnormally with code 1 at Mon Oct 16 18:52:56

The function causing this problem is mp.to_appended -- replacing it with mp.at_end(mp.output_efield_z) does not result in any errors.

The file created by mp.to_appended is nosetests-ez.h5, but it is corrupted.

$ h5debug nosetests-ez.h5 
HDF5-DIAG: Error detected in HDF5 (1.8.14) thread 0:
  #000: H5F.c line 604 in H5Fopen(): unable to open file
    major: File accessibilty
    minor: Unable to open file
  #001: H5Fint.c line 1085 in H5F_open(): unable to read superblock
    major: File accessibilty
    minor: Read failed
  #002: H5Fsuper.c line 277 in H5F_super_read(): file signature not found
    major: File accessibilty
    minor: Not an HDF5 file
cannot open file

It seems the program is being interrupted while writing to the .h5 file?

GDSII integration and Basic Shape/Structure Definitions

It would be helpful to have direct support for GDSII files (i.e. both importing into MEEP geometry and exporting). This would appeal to many researchers who use FDTD methods to optimize photonic crystal cavities or other nanodevices, which then require the pattern to be written to a GDSII file for fabrication. Furthermore, it would be helpful to include a quick and simple way to add common geometries such as circles and rectangles, to aid rapid development.
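As a rough illustration of the import direction only, here is a minimal sketch that turns planar layout polygons into extruded Meep objects. The 'polygons' list, the thickness, and the material are placeholders (a GDSII reader would supply the polygons), and mp.Prism refers to the prism object in more recent Meep Python releases, so treat the whole snippet as an assumption rather than an existing feature:

import numpy as np
import meep as mp

# Hypothetical input: planar polygons as (N, 2) vertex arrays, e.g. produced
# by a GDSII reader; the values below are placeholders.
polygons = [np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])]
thickness = 0.22
silicon = mp.Medium(epsilon=12)

geometry = []
for poly in polygons:
    vertices = [mp.Vector3(x, y, 0) for x, y in poly]
    # Extrude each layout polygon into a 3D prism of the given thickness.
    geometry.append(mp.Prism(vertices, height=thickness, material=silicon))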

Can't compile after installing

Hi All,

I've installed everything as instructed on the installation page of MEEP.
I've installed harminv, openmpi, hdf5, and openblas. I installed meep using:

sudo apt-get install meep h5utils

Ok. Then I go to tutorials and try to compile using:

g++ -malign-double tutorial1.cpp -o tutorial1 -lmeep -lhdf5 -lz -lgsl -lharminv -llapack -lcblas -latlas -lfftw3 -lm

output:
tutorial1.cpp:1:20: fatal error: meep.hpp: No such file or directory compilation terminated.

STL import

I have implemented a method to convert an STL file to a Yee grid and want to import the grid file into meep.
The function I need is something like read_epsilon_file in meepgeom.cpp. Is there any instruction for this?

Thank you indeed.

h5topng seg faults in macOS Sierra

Since h5utils is part of the meep constellation, I thought this might be the appropriate venue for this problem. After having installed meep-1.3 on macOS Sierra (see issue #64), I find that, when invoking the h5topng command with the appropriate options (-S3) on the file test1-eps-000000.00.h5, generated by the first tutorial meep program (straight waveguide), I get a memory error,

h5topng -S3 test1-eps-000000.00.h5
h5topng(28360,0x7fffe584e3c0) malloc: *** mach_vm_map(size=18446744049097375744) failed (error code=3)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
h5topng error: out of memory

Subsequent invocations of h5topng simply result in a segmentation fault. The h5utils package was downloaded from the meep MIT Ab Initio site and the appropriate patch, hdf5_h5utils_libpng_patch, was applied with the patch command. The "configure", "make", "sudo make install" sequence ran without incident.

I have no reason to believe the h5 file is defective although I don't know how to verify this independently. The "meep test1.ctl >& test1.out" command runs, produces the h5 file, and no errors.

Does anybody know how to diagnose the error allocation problem and fix it?

Python import error simulation.py

I installed meep with the Python interface, essentially following the procedure here: https://www.mail-archive.com/[email protected]/msg05819.html

In addition, I included the flags '--with-python' and '--enable-shared', which I deduced were necessary from looking at configure.sh.

After running make install, 'import meep' failed with a message about missing 'simulation.py'. I found that simulation.py had simply not been copied to .../site-packages/meep/, and manually doing so seemed to fix the problem. I couldn't find anything relevant in the Makefiles except

PY_PKG_FILES = \
    __init__.py \
    $(srcdir)/geom.py \
    $(srcdir)/simulation.py \
    $(srcdir)/source.py \
    .libs/_meep.so
In any case, I'm guessing there was some small bug in the install process, perhaps someone else will corroborate this experience.

Proposed contributions by Arthur Thijssen

As Arthur Thijssen wrote on the mailing list (27 March 2014), he proposes several new features for MEEP. They are held separately in a project here on GitHub, https://github.com/Arthur-Thijssen/MEEP-actt/tree/master/src; maybe they should be converted into an experimental fork of MEEP.

The features listed in the mail [2] are

- Output of fields / field components in the frequency domain. 
- Near to far field transform. Using the algorithm described in "Computational 
- Mode volumes in the frequency domain.

Meep on Mac OS Sierra

Meep 1.3 seems to be incompatible with Guile 2.0.12. Any plans to update Meep to work with this version?

For example, the straight waveguide example gives

creating output file "./eps-000000.00.h5"...
ERROR: In procedure %run-finalizers:
ERROR: In procedure delete-meep-volume: Wrong type argument in position 1: #<finalized smob 10aa7dc80>

the issue with the installation with python

Hi,

I can compile meep --without-python, but if I turn on the python option, I get the following error message:
"No rule to make target `../src/libmeep.la', needed by `_meep.la'. Stop."

Thanks,

Xin

Any working build for MacOS X 10.11 or 12 (El Cap. or Sierra)?

Is there a working build available? On El Capitan (and now on Sierra) dependencies compiled fine, but meep itself won't: configure: error: C++ compiler cannot create executables
I was using the ./autogen.sh --with-mpi --enable-maintainer-mode --enable-shared --prefix=/usr/local command to build.

meep not linking with fftw3_mpi

Hi,
I compiled (with gcc 4.8.3) the MPI version of meep (v1.3) with the MPI version of fftw3 (single threaded, static library, libfftw3_mpi.a).

I am confident that my meep installation is fine since I have run many simulations and compared numerical and analytical results.

However, when I tried computing the LDOS, which requires a Fourier transform and is already implemented in meep, I got the following errors during linking.

monitor.cpp:(.text+0x2c87): undefined reference to `fftw_plan_dft_1d'
monitor.cpp:(.text+0x2c92): undefined reference to `fftw_execute'
monitor.cpp:(.text+0x2c9a): undefined reference to `fftw_destroy_plan'

I have the -lfftw3_mpi linker flag and have correctly set the -L and -I flags to the folder where my libfftw3_mpi.a file is located.

It works fine if I use the -lfftw3 linker flag instead.
I am not sure why meep looks for fftw3 instead of fftw3_mpi.

LDOS in 2D

I tried comparing the LDOS computed in meep with the analytical value in vacuum, but could not get the two results to agree.

Analytically, LDOS in 2D = 2*\omega/(\pi c^2) * Tr[ Im[G] ], where G is the dyadic Green's function of vacuum. In 2D, Im[G] = 1/4 for all three diagonal components.
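Restating the quoted expression in standard notation (same content, just typeset):

\rho_{2\mathrm{D}}(\mathbf{r},\omega)
  = \frac{2\omega}{\pi c^2}\,
    \operatorname{Tr}\!\bigl[\operatorname{Im} G(\mathbf{r},\mathbf{r};\omega)\bigr],
\qquad
\operatorname{Im} G_{ii} = \tfrac{1}{4}\ \text{in vacuum, for each diagonal component}.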

I use a continuous-wave point source at a fixed frequency to compute the LDOS at that frequency and at that location (I could use other temporal sources to get the LDOS at several frequencies in a single FDTD run, but for now I just want to compare the numerical with the analytical results, and for simplicity I choose a continuous-wave source).

I am using meep as a C++ library and use the dft_ldos object with its update and ldos functions to compute the LDOS.

The result is off by an order of magnitude.

What is the unit of LDOS in 2D in meep?

FDFD problems on Meep-MPI

I am currently attempting to use Meep's FDFD methods to determine the frequency response of a system. When run with serial Meep this works fine, but when run with parallel Meep (Meep-MPI) it outputs that it has finished after one step with zero residual. See below for the output:

Using MPI version 3.0, 48 processes

Initializing structure...
Working in 3D dimensions.
Computational cell is 20 x 20 x 20 with resolution 5
 block, center = (0,0,0)
      size (5,5,5)
      axes (1,0,0), (0,1,0), (0,0,1)
      dielectric constant epsilon diagonal = (4,4,4)
time for set_epsilon = 0.060272 s
time for set_conductivity = 0.000830173 s
time for set_conductivity = 0.000797987 s
time for set_conductivity = 0.000797987 s

Meep: using complex fields.
on time step 1 (time=0.1), 9.64879 s/step
final residual = 0
Finished solve_cw after 1 steps and 0 CG iters.
creating output file "./fdfd-ex-000000.10.h5"...
creating output file "./fdfd-ey-000000.10.h5"...
creating output file "./fdfd-ez-000000.10.h5"...

Using setuptools and wheels for python build

Hi,
Thanks for the project, it is great. This is what I was looking for.

However, I noticed that the Python build is performed using make and not setuptools. Setuptools automates the build process and increases the compatibility of the code/build on Windows/Linux/macOS.

I would like to add a build process for the project using setuptools. Setuptools also provides the ability to make wheels, which improves distribution because the code won't need to be built on each machine. Wheels can also decrease build errors significantly.
https://packaging.python.org/tutorials/distributing-packages/

This will also reduce the dependency on autotools.
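For illustration only, a minimal sketch of what a setuptools-based build might look like; the module name, sources, and flags below are placeholders, not the actual Meep build configuration:

from setuptools import setup, Extension

# Hypothetical extension wrapping the SWIG-generated C++ sources.
meep_ext = Extension(
    name="meep._meep",
    sources=["python/meep_wrap.cxx"],   # placeholder path
    include_dirs=["src"],
    libraries=["meep"],
)

setup(
    name="meep",
    version="1.3",
    packages=["meep"],
    package_dir={"meep": "python"},
    ext_modules=[meep_ext],
)

A setup.py along these lines would also let 'pip wheel .' produce a binary wheel for distribution.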

Thanks

field-energy-in-box returns 0.0 always

I am running Meep in fdtd mode, with a plane wave hitting a sphere and pml coating the computational box. I am interested in the point at which transients die down, and so wanted to have Meep output the total field energy in the box. Following the example in the reference guide I was able to get it to output 0.0 at every stage (which is incorrect given that there is energy in the system). Here's my control code:

(define-param xSize 6.0)
(define-param ySize 6.0)
(define-param zSize 30.0)
(define-param pmlSize 2.0)
(define-param sphereSize 0.5)
(define-param realeps -2.00868962)
(define-param freq (* 1 0.002099383))
(define-param imageps 3.48099144)
(define-param res 5)

(set! geometry-lattice (make lattice (size xSize ySize zSize)))

(set! geometry (list
  (make sphere (center 0 0 (- 10)) (radius sphereSize)
    (material (make dielectric (epsilon realeps)
                    (D-conductivity (/ (* 2 pi freq imageps) realeps)))))))

(set! pml-layers (list (make pml (thickness pmlSize))))

(set! sources (list (make source (src (make continuous-src (frequency freq)))
                          (component Ex) (center 0 0 15)
                          (size xSize ySize 0) (amplitude 1))))

(set! resolution res)

(set! force-complex-fields? true)

(define (netE) (print "Net Energy:" (field-energy-in-box (meep-fields-total-volume fields)) "\n" ))

(run-until 10000
(at-every 1 netE)
)

Feature request: Angular radiation (or scattering) pattern calculation

A standard problem that computational electromagnetics tools are used for is calculating the far-field radiation pattern (https://en.wikipedia.org/wiki/Radiation_pattern), either from a driven element (e.g. an antenna) or scattered from an object (e.g. Mie scattering or radar cross section). Meep's Near2Far calculation is helpful, but it's not easy to get the entire angular (theta, phi) pattern without extensive postprocessing.
It would be very nice to have a feature where we specify the frequency points (for the DFT) and the angular resolution or number of points in theta/phi, and Meep outputs an HDF5 file with theta, phi, and the far field. For example, pattern(frequencies, thetas, phis) -> farfield_pattern.h5.
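As a rough sketch of the requested output layout only: the farfield() helper below is a placeholder for whatever near-to-far routine is used (Meep's near2far data or an external transform); only the theta/phi sweep and the HDF5 file structure are illustrated:

import numpy as np
import h5py

def farfield(freq, theta, phi):
    # Placeholder: return the complex far-field components for one direction.
    return np.zeros(6, dtype=complex)

freqs = np.linspace(0.8, 1.2, 5)
thetas = np.linspace(0, np.pi, 91)
phis = np.linspace(0, 2 * np.pi, 181)

pattern = np.zeros((len(freqs), len(thetas), len(phis), 6), dtype=complex)
for i, f in enumerate(freqs):
    for j, th in enumerate(thetas):
        for k, ph in enumerate(phis):
            pattern[i, j, k, :] = farfield(f, th, ph)

# Write the sampled pattern to an HDF5 file, as proposed above.
with h5py.File("farfield_pattern.h5", "w") as hf:
    hf["frequencies"] = freqs
    hf["theta"] = thetas
    hf["phi"] = phis
    hf["farfield"] = pattern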

make fails: swig generates errors

Trying to install meep-1.3 on a MacBookPro running macOS Sierra. All the dependencies and prerequisites are installed, including guile-2.0.11 and libctl. Downloaded the meep-1.3 tarball from GitHub, extracted and ran sh autogen.sh. Configure ran to completion, but make stopped with the following error:

swig -I../src -c++ -guile -o meep_wrap.cxx meep.i
make[2]: swig: No such file or directory

Used "find ~ -name meep.i" to search for meep.i and found it in

$HOME/install/meep-1.3/libctl/meep.i

also

$HOME/install/meep-1.3/python/meep.i

Some environment variable needs the path, but I'm not sure which one.

Need some help.


update 12 August 2017

Tried installing SWIG from its GitHub repository to see if it would help. First I removed the meep directory. Then I cloned SWIG from GitHub; it installed without error. Then I cloned meep from stevengj/meep and ran sh autogen.sh. The script ran without stopping. Then I ran make. Make ended with 12 errors, all with a similar message:

meep-ctl-swig.hpp:32: Error: 'dft_flux_flux' is multiply defined in the generated target language module.
../src/meep.hpp:896: Error: Previous declaration of 'dft_flux_flux'
meep-ctl-swig.hpp:33: Error: 'dft_force_force' is multiply defined in the generated target language module.
../src/meep.hpp:925: Error: Previous declaration of 'dft_force_force'
meep-ctl-swig.hpp:34: Error: 'dft_ldos_ldos' is multiply defined in the generated target language module.
../src/meep.hpp:1000: Error: Previous declaration of 'dft_ldos_ldos'
meep-ctl-swig.hpp:35: Error: 'dft_ldos_F' is multiply defined in the generated target language module.
../src/meep.hpp:1001: Error: Previous declaration of 'dft_ldos_F'
meep-ctl-swig.hpp:36: Error: 'dft_ldos_J' is multiply defined in the generated target language module.
../src/meep.hpp:1002: Error: Previous declaration of 'dft_ldos_J'
meep-ctl-swig.hpp:37: Error: 'dft_near2far_farfield' is multiply defined in the generated target language module.
../src/meep.hpp:956: Error: Previous declaration of 'dft_near2far_farfield

These errors are different from yesterday, but appear to closely resemble errors reported here, https://www.mail-archive.com/[email protected]/msg05173.html


update 14 August

Got meep-1.3 installed on macOS Sierra, running on a MacBookPro. For some reason ONLY the tarball downloaded from the meep ab initio site installs without error. Both the git clone and the tarball on GitHub generate the errors shown in the 12 August update. I also found that only h5utils cloned from GitHub works correctly with h5 files generated by meep operating on a foo.ctl file (specifically h5topng). Other installations of h5utils from home-brew or MacPorts result in seg faults.

field-energy-in-box and MPI

Dear All,

I am not sure that it is a bug, but I decided to report it anyway. There is a chance that it is a local problem with our computer cluster, but it could be an issue with MEEP.

In my script, I have this command:
'(define-param EQW1 (field-energy-in-box (volume (center 0 0 zQW1) (size wbox wbox tQW))))'
It is supposed to calculate the energy in a layer.

Meep fails to do this and throws an error message:
[node127:121267] *** An error occurred in MPI_Allreduce
[node127:121267] *** reported by process [1095958529,1946303531564662784]
[node127:121267] *** on communicator MPI_COMM_WORLD
[node127:121267] *** MPI_ERR_IN_STATUS: error code in status
[node127:121267] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[node127:121267] *** and potentially your MPI job)

This is the error message from a MEEP build with OpenMPI 1.8.3. If MEEP is built with MPICH2 2.1 support, then attempts to run the code sometimes end with a similar error message.

What is interesting:
Meep throws the error randomly (or I cannot spot any systematic behaviour). 5-6 out of 10 attempts to run this code end with the error (the rest end successfully). The error occurs only when I use multiple nodes.

Compiler: icc 14.0.0 (gcc version 4.4.7 compatibility)

Any help is appreciated.

Let me know if you need more information.

best wishes

p.s. many thanks for adding far field calculations in meep.

Julia version or interface for MEEP?

Just wondering if anyone has had a chance to make a Julia version of MEEP, or a Julia interface to MEEP? This comes out of my wish that there should be more packages written in Julia, and I notice that Steven is an active contributor to the Julia language repo... If there is such a plan in the air, I may be able to work on other tools that interact with MEEP in Julia. Thanks.

Define meep.speed_of_light, meep.eps0 and meep.mu0

Before I start editing my copy of MEEP source code, I would like to discuss one feature proposal that could help many newcomers (see http://meepunits.wikia.com/wiki/Meep_unit_transformation_Wiki, http://comments.gmane.org/gmane.comp.science.electromagnetism.meep.general/1406 etc.)

Could MEEP implement the following internal constants: meep.speed_of_light, meep.eps0 and meep.mu0?

By default, they may be set to 1., 1. and 1. to maintain backward compatibility. Many people, including me, would probably set them to 2.997e8, 8.85e-12, and 1.25e-6, respectively, so that all simulation settings and results are in SI units.

If implemented, we would have to make sure that all commands receiving or returning data correctly divide/multiply by these constants, but this should not affect the timestepping speed. -- F D
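For reference, the conversions this proposal would make implicit are easy to do by hand today. A minimal sketch, assuming a characteristic length a in metres (Meep frequencies are in units of c/a):

c_SI = 2.99792458e8   # speed of light, m/s

a = 1e-6              # characteristic length: 1 um, for example
f_meep = 0.15         # frequency in Meep units (c/a)

f_SI = f_meep * c_SI / a        # frequency in Hz
wavelength_SI = a / f_meep      # vacuum wavelength in metres
print(f_SI, wavelength_SI)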

Benchmarking Meep in 3D

After careful simulation, Meep's 3D Green's function agreed with the analytical expressions for free space and for a metallic sphere.
I decided to close this issue.

Make fails: undeclared identifier 'SCM_VECTORP' and 'SCM_VECTOR_LENGTH'

On Mac OS X with Guile 2.2.2, make fails with the following errors:

meep_wrap.cxx:1384:10: error: use of undeclared identifier 'SCM_VECTORP'
return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;
^
meep_wrap.cxx:1384:28: error: use of undeclared identifier 'SCM_VECTOR_LENGTH'
return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;

It worked for me to update line 1384 in libctl/meep_wrap.cxx to

return scm_is_vector(o) && scm_c_vector_length(o) == 3;

inspired by the issue here: https://github.com/stevengj/meep/issues/16 (I know this is almost the same content, but I think this title makes it easier for users with similar issues to find the solution).

I did not make a pull request for this, since I have never used guile for anything else and therefore don't know about backwards compatibility and so on, but I guess it needs to be updated to ensure that users with newer versions of guile can still compile Meep.

configure can't find ctl.h

I am trying to install meep and company on an iMac running 10.12.5 (Sierra). Harminv and libctl install without error. When I try to install meep itself, however, the configure script complains that it cannot find the ctl.h header.

configure:25526: error: Couldn't find the ctl.h header file for libctl.
ac_cv_header_ctl_h=no

However, ctl.h is where it is supposed to be. A "find -name ctl.h" results in
/usr/local/include/ctl.h
/usr/local/share/libctl/ctl.h

The config.log does not appear to be any more enlightening. Is there some way to know where the configure script is looking for ctl.h?
Any ideas?

Guile deprecated features

Small warning:

I got the following messages from Guile when running meep

Some deprecated features have been used.  Set the environment
variable GUILE_WARN_DEPRECATED to "detailed" and rerun the
program to get more information.  Set it to "no" to suppress this message.

When setting it to "detailed":

SCM_VECTORP is deprecated.  Use scm_is_vector instead.
SCM_VECTOR_LENGTH is deprecated.  Use scm_c_vector_length instead.
GNU Guile 2.0.11
Copyright (C) 1995-2014 Free Software Foundation, Inc.

Guile comes with ABSOLUTELY NO WARRANTY; for details type `,show w'.
This program is free software, and you are welcome to redistribute it
under certain conditions; type`,show c' for details.

Enter `,help' for help.
scheme@(guile-user)>

Running this on Fedora 21. 

meep installation on MacOS

I'm not sure if this is the right place to report this; let me know if I should use another channel.

After installing mpb, harminv, and libctl successfully, I came across the following error when trying make in the meep directory.

make
/Applications/Xcode.app/Contents/Developer/usr/bin/make  all-recursive
Making all in src
/Applications/Xcode.app/Contents/Developer/usr/bin/make  all-am
  CXX      bands.lo
bands.cpp:366:7: warning: comparison of array 'this->k' equal to a null pointer is
      always false [-Wtautological-pointer-compare]
  if (k == 0 && gv.dim == Dcyl && m != 0) fields_considered /= 2;
      ^    ~
1 warning generated.
  CXXLD    libmeep.la
Making all in libctl
gen-ctl-io --cxx --header -o ctl-io.h meep.scm /usr/local/share/libctl
;;; note: source file /usr/local/share/libctl/base/include.scm
;;;       newer than compiled /Users/jvmirca/.cache/guile/ccache/2.0-LE-8-2.0/usr/local/share/libctl/base/include.scm.go
;;; note: auto-compilation is enabled, set GUILE_AUTO_COMPILE=0
;;;       or pass the --no-auto-compile argument to disable.
;;; compiling /usr/local/share/libctl/base/include.scm
;;; compiled /Users/jvmirca/.cache/guile/ccache/2.0-LE-8-2.0/usr/local/share/libctl/base/include.scm.go
cp -f /usr/local/share/libctl/base/main.c main.cpp
cp -f /usr/local/share/libctl/utils/geom.c geom.cpp
gen-ctl-io --cxx --code -o ctl-io.cpp meep.scm /usr/local/share/libctl
gen-ctl-io --cxx --swig -o ctl-io.i meep.scm /usr/local/share/libctl
/Applications/Xcode.app/Contents/Developer/usr/bin/make  all-am
  CXX      meep.o
  CXX      structure.o
structure.cpp:1511:41: warning: conversion from string literal to 'char *' is
      deprecated [-Wc++11-compat-deprecated-writable-strings]
  number no_size = 2.0 / ctl_get_number("infinity");
                                        ^
1 warning generated.
  CXX      meep_wrap.o
meep_wrap.cxx:1389:10: warning: 'scm_i_vectorp' is deprecated
      [-Wdeprecated-declarations]
  return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;
         ^
/usr/local/Cellar/guile/2.0.12_1/include/guile/2.0/libguile/deprecated.h:475:32: note: 
      expanded from macro 'SCM_VECTORP'
#define SCM_VECTORP(x)         scm_i_vectorp(x)
                               ^
/usr/local/Cellar/guile/2.0.12_1/include/guile/2.0/libguile/deprecated.h:467:20: note: 
      'scm_i_vectorp' has been explicitly marked deprecated here
SCM_DEPRECATED int scm_i_vectorp (SCM x);
                   ^
meep_wrap.cxx:1389:28: warning: 'scm_i_vector_length' is deprecated
      [-Wdeprecated-declarations]
  return SCM_VECTORP(o) && SCM_VECTOR_LENGTH(o) == 3;
                           ^
/usr/local/Cellar/guile/2.0.12_1/include/guile/2.0/libguile/deprecated.h:476:32: note: 
      expanded from macro 'SCM_VECTOR_LENGTH'
#define SCM_VECTOR_LENGTH(x)   scm_i_vector_length(x)
                               ^
/usr/local/Cellar/guile/2.0.12_1/include/guile/2.0/libguile/deprecated.h:468:30: note: 
      'scm_i_vector_length' has been explicitly marked deprecated here
SCM_DEPRECATED unsigned long scm_i_vector_length (SCM x);
                             ^
meep_wrap.cxx:53281:63: error: address of overloaded function '_wrap_do_harminv' does
      not match required type 'void'
  scm_c_define_gsubr("do-harminv", 0, 0, 1, (swig_guile_proc) _wrap_do_harminv);
                                                              ^~~~~~~~~~~~~~~~
meep_wrap.cxx:51198:1: note: candidate function
_wrap_do_harminv (SCM s_0, SCM s_1, SCM s_2, SCM s_3, SCM s_4, SCM s_5, SCM s_6...
^
meep_wrap.cxx:49879:1: note: candidate function
_wrap_do_harminv(SCM rest)
^
2 warnings and 1 error generated.
make[3]: *** [meep_wrap.o] Error 1
make[2]: *** [all] Error 2
make[1]: *** [all-recursive] Error 1
make: *** [all] Error 2

The issue seems to be with the guile library that I installed via home-brew... Does anyone have any clue about this?

Thanks!

Make errors building from source in Debian Testing(Buster)

Hello everyone,

I'm trying to build meep from source on the master branch (which shows the "build passed" badge at the moment). My machine's uname -a shows:

Linux 4.12.0-1-amd64 #1 SMP Debian 4.12.6-1 (2017-08-12) x86_64 GNU/Linux

Installed the following prerequisites recommended from the Meep readthedocs page,

apt install libctl5 libctl-dev guile-2.0-dev guile-2.0 mpi-default-bin mpi-default-dev swig libhdf5-dev libhdf5-cpp-100 libhdf5-100 libhdf5-mpi-dev libhdf5-openmpi-dev h5utils hdf5-tools

Once ./configure --with-mpi finishes, I do make -j4 and the following errors appear:

bend-flux-ll.cpp:18:10: fatal error: ctl-math.h: No existe el fichero o el directorio (Spanish for "file not found")
 #include "ctl-math.h"
meepgeom.cpp:44:26: error: invalid conversion from ‘void*’ to ‘SCM {aka scm_unused_struct*}’ [-fpermissive]
 material_type vacuum = { (void *)&vacuum_material_data };
                          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
meepgeom.cpp: In function ‘material_type meep_geom::make_dielectric(double)’:
meepgeom.cpp:61:24: error: invalid conversion from ‘void*’ to ‘SCM {aka scm_unused_struct*}’ [-fpermissive]
   material_type mt = { (void *)md };
                        ^~~~~~~~~~
meepgeom.cpp: In function ‘void meep_geom::epsilon_file_material(meep_geom::material_data*, vector3)’:
meepgeom.cpp:352:25: error: invalid conversion from ‘void*’ to ‘SCM {aka scm_unused_struct*}’ [-fpermissive]
   default_material.data=(void *)md;
                         ^~~~~~~~~~

I tried to find ctl-math.h (find / -name "ctl-math.h") in standard places like /usr/include and even across the whole system, and nothing was found.
Since the officially packaged meep for Debian doesn't work either, I find the use or compilation of meep virtually impossible, since issue #57 is present on my system too.

Eigenmode source calculated at wrong position in 3D

Hi everybody,

when I have a 3d geometry of a straight waveguide parallel to the z-direction and I use the eigenmode source on one side, I get the correct eigenmode there and thus the correct propagation through the waveguide.
However, when I taper the waveguide, for example, and again use the eigenmode source at one end of the waveguide, I do not get the correct eigenmode! After testing I found out that the eigenmode is always calculated at the z-center. That means: even if I place the eigenmode source at one end of the waveguide, say (center 0 0 14) (size sx sy 0), the eigenmode is calculated at (center 0 0 0) (size sx sy 0).
Not even the (eig-lattice-center 0 0 14) (eig-lattice-size sx sy 0) commands can change this.

Can somebody confirm this bug?

Best regards
Marc

Python test failures: MPI_Comm_size() called before MPI_INIT

I have installed meep with the Python interface from the procedure here: https://www.mail-archive.com/[email protected]/msg05819.html

The non-MPI install seems to work and passes all the tests in make check. However, when building with the --with-mpi flag passed to configure.sh, 9 of the tests fail. Here is an excerpt from python/test-suite.log:

meep 1.3: python/test-suite.log

# TOTAL: 11
# PASS: 2
# SKIP: 0
# XFAIL: 0
# FAIL: 9
# XPASS: 0
# ERROR: 0

.. contents:: :depth: 2

FAIL: tests/3rd_harm_1d

*** The MPI_Comm_size() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[Jyan-laptop2:10071] Local abort before MPI_INIT completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!

The tests bend_flux, cyl_ellipsoid, holey_wvg_bands, holey_wvg_cavity, material_dispersion, physical, ring, and source fail with the exact same message.

The tests simulation.py and geom.py are passing.

meep and python-meep install on MacOS Sierra

I tried to install meep and python-meep on macOS Sierra, following Glenn's procedure from the meep-discuss archive with some minor modifications. I posted my procedure on my blog.

Currently, meep "make check" passed all the tests except symmetry which is fine. Python-meep works fine too with the given example. But both "meep" or "meep-mpi" with scheme file failed with the following error:

Using MPI version 3.1, 1 processes
Backtrace:
In ice-9/boot-9.scm:
 160: 10 [catch #t #<catch-closure 104c50680> ...]
In unknown file:
   ?: 9 [apply-smob/1 #<catch-closure 104c50680>]
In ice-9/eval.scm:
 432: 8 [eval # #]
 432: 7 [eval # #]
In unknown file:
   ?: 6 [primitive-load "ring-cyl.ctl"]
In ice-9/eval.scm:
 467: 5 [eval # ()]
 387: 4 [eval # ()]
 387: 3 [eval # ()]
 387: 2 [eval # ()]
 393: 1 [eval #<memoized Ez> ()]
In unknown file:
   ?: 0 [memoize-variable-access! #<memoized Ez> #<directory # 104c28bd0>]

ERROR: In procedure memoize-variable-access!:
ERROR: Unbound variable: Ez

Seems like a guile problem, but I'm really not sure how to fix this.
Any suggestion will be appreciated and please let me know if you need more information.

Thanks,

meep build failure for ppc64 architecture

  • as reported on the osc server (1_), some tests for the meep package are failing
    for ppc/ppc64 but not for ppc64le.
  • I started to work on the failing aniso_disp test for ppc64.
  • using gdb (2_) I identified a problem for ppc64
    in the generated assembly code when calling harminv_get_amplitude from do_harminv.
  • as detailed in (4_), the gcc version is 4.8.3 for the two openSUSE environments.

(1_) https://build.opensuse.org/package/show/openSUSE:Factory:PowerPC/meep
=== extract of tests/test-suite.log:
harminv: failure on line 853 of harminv.c: argument out of range in harminv_get_amplitude

(2_) What is abnormal is the assembly code when calling harminv_get_amplitude:
it passes 3 registers r3 r4 r5 (with hd in r4 and 0 in r5)
while the library expects only r3 r4 (with hd in r3 and 0 in r4).

Breakpoint 2, 0x00003fffb7276204 in .harminv_get_amplitude () from /usr/lib64/libharminv.so.2
(gdb) bt
#0 0x00003fffb7276204 in .harminv_get_amplitude () from /usr/lib64/libharminv.so.2
#1 0x00003fffb7ee97b4 in meep::do_harminv (data=, n=, dt=0.0025000000000000001, fmin=0, fmax=1, maxbands=, amps=0x3fffffffe730,
freq_re=0x3fffffffe6b0, freq_im=0x3fffffffe6f0, errors=0x0, spectral_density=, Q_thresh=50, rel_err_thresh=1e+20, err_thresh=0.01, rel_amp_thresh=-1,
amp_thresh=-1) at bands.cpp:472
#2 0x00000000100020a0 in main (argc=1, argv=0x3ffffffff8e8) at aniso_disp.cpp:131

0x00003fffb7ee97a0 meep::do_harminv()+784: ld r2,40(r1)
0x00003fffb7ee97a4 meep::do_harminv()+788: mr r3,r22
0x00003fffb7ee97a8 meep::do_harminv()+792: mr r4,r31
0x00003fffb7ee97ac meep::do_harminv()+796: li r5,0
0x00003fffb7ee97b0 meep::do_harminv()+800: bl 0x3fffb7edd028 <00000017.plt_call.harminv_get_amplitude>
0x00003fffb7ee97b4 meep::do_harminv()+804: ld r2,40(r1)

=> 0x00003fffb7276204 <.harminv_get_amplitude+20>: blt- cr7,0x3fffb727628c <.harminv_get_amplitude+156>
0x00003fffb7276208 <.harminv_get_amplitude+24>: lwz r9,20(r3)
0x00003fffb727620c <.harminv_get_amplitude+28>: mr r31,r3
0x00003fffb7276210 <.harminv_get_amplitude+32>: cmpw cr7,r9,r4
0x00003fffb7276214 <.harminv_get_amplitude+36>: ble- cr7,0x3fffb727628c <.harminv_get_amplitude+156>
