I left a build going overnight and came back to find it stuck, with the following warning:
WARNING:root:Accessing the Emscripten cache at "/home/.../.emscripten_cache/asmjs" is taking a long time, another process should be writing to it. If there are none and you suspect this process has deadlocked, try deleting the lock file "/home/.../.emscripten_cache.lock" and try again. If this occurs deterministically, consider filing a bug.
tools/ninja -C out/debug ui
But it was stuck with no apparent progress and no obvious resource usage (i.e. nothing visible at a glance in top) for more than 20 minutes.
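The warning suggests deleting the lock file if no other process is actually writing to the cache. Before deleting it blindly, one can probe whether anything still holds the lock. A minimal sketch of such a probe — assuming an exclusive flock()-style lock, which is an assumption about Emscripten's implementation, demonstrated against a scratch file rather than the real (truncated) lock path:

```python
import errno
import fcntl
import tempfile

def lock_is_held(path):
    """Return True if some other open file description holds an
    exclusive flock() on `path` (non-blocking probe)."""
    with open(path, "a") as f:
        try:
            fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except OSError as e:
            if e.errno in (errno.EACCES, errno.EAGAIN):
                return True   # somebody else holds the lock
            raise
        fcntl.flock(f, fcntl.LOCK_UN)
        return False

# Demo: a scratch file standing in for .emscripten_cache.lock.
probe = tempfile.NamedTemporaryFile(delete=False)
print(lock_is_held(probe.name))   # nothing holds it yet
holder = open(probe.name, "a")
fcntl.flock(holder, fcntl.LOCK_EX)
print(lock_is_held(probe.name))   # now it is held
```

flock() locks conflict between distinct open file descriptions even within one process, which is what makes the two-handle demo above work; if the probe reports no holder, the stale lock file is safe to delete.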
[1/10] python ../../gn/standalone/write_ui_dist_file_map.py --out obj/ui/gen/dist_file_map.ts --strip ui ui/assets/brand.png ui/assets/favicon.png ui/assets/logo-3d.png ui/assets/rec_atrace.png ui/assets/rec_battery_counters.png ui/assets/rec_board_voltage.png ui/assets/rec_cpu_coarse.png ui/assets/rec_cpu_fine.png ui/assets/rec_cpu_freq.png ui/assets/rec_cpu_voltage.png ui/assets/rec_cpu_wakeup.png ui/assets/rec_ftrace.png ui/assets/rec_java_heap_dump.png ui/assets/rec_lmk.png ui/assets/rec_logcat.png ui/assets/rec_long_trace.png ui/assets/rec_mem_hifreq.png ui/assets/rec_meminfo.png ui/assets/rec_native_heap_profiler.png ui/assets/rec_one_shot.png ui/assets/rec_ps_stats.png ui/assets/rec_ring_buf.png ui/assets/rec_vmstat.png ui/assets/perfetto.scss ui/assets/typefaces.scss ui/assets/sidebar.scss ui/assets/topbar.scss ui/assets/record.scss ui/assets/common.scss ui/assets/modal.scss ui/assets/details.scss ui/assets/catapult_trace_viewer.html ui/assets/catapult_trace_viewer.js ui/controller_bundle.js ui/controller_bundle.js.map ui/engine_bundle.js ui/engine_bundle.js.map ui/frontend_bundle.js ui/frontend_bundle.js.map ui/index.html ui/perfetto.css ui/assets/MaterialIcons.woff2 ui/assets/Raleway-Regular.woff2 ui/assets/Raleway-Thin.woff2 ui/assets/RobotoCondensed-Light.woff2 ui/assets/RobotoCondensed-Regular.woff2 ui/assets/RobotoMono-Regular.woff2 ui/trace_processor.wasm ui/trace_to_text.wasm
FAILED: obj/ui/gen/dist_file_map.ts
python ../../gn/standalone/write_ui_dist_file_map.py --out obj/ui/gen/dist_file_map.ts --strip ui ui/assets/brand.png ui/assets/favicon.png ui/assets/logo-3d.png ui/assets/rec_atrace.png ui/assets/rec_battery_counters.png ui/assets/rec_board_voltage.png ui/assets/rec_cpu_coarse.png ui/assets/rec_cpu_fine.png ui/assets/rec_cpu_freq.png ui/assets/rec_cpu_voltage.png ui/assets/rec_cpu_wakeup.png ui/assets/rec_ftrace.png ui/assets/rec_java_heap_dump.png ui/assets/rec_lmk.png ui/assets/rec_logcat.png ui/assets/rec_long_trace.png ui/assets/rec_mem_hifreq.png ui/assets/rec_meminfo.png ui/assets/rec_native_heap_profiler.png ui/assets/rec_one_shot.png ui/assets/rec_ps_stats.png ui/assets/rec_ring_buf.png ui/assets/rec_vmstat.png ui/assets/perfetto.scss ui/assets/typefaces.scss ui/assets/sidebar.scss ui/assets/topbar.scss ui/assets/record.scss ui/assets/common.scss ui/assets/modal.scss ui/assets/details.scss ui/assets/catapult_trace_viewer.html ui/assets/catapult_trace_viewer.js ui/controller_bundle.js ui/controller_bundle.js.map ui/engine_bundle.js ui/engine_bundle.js.map ui/frontend_bundle.js ui/frontend_bundle.js.map ui/index.html ui/perfetto.css ui/assets/MaterialIcons.woff2 ui/assets/Raleway-Regular.woff2 ui/assets/Raleway-Thin.woff2 ui/assets/RobotoCondensed-Light.woff2 ui/assets/RobotoCondensed-Regular.woff2 ui/assets/RobotoMono-Regular.woff2 ui/trace_processor.wasm ui/trace_to_text.wasm
Traceback (most recent call last):
  File "../../gn/standalone/write_ui_dist_file_map.py", line 90, in <module>
    sys.exit(main())
  File "../../gn/standalone/write_ui_dist_file_map.py", line 71, in main
    for fname, digest in digests.iteritems():
AttributeError: 'dict' object has no attribute 'iteritems'
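The AttributeError itself is straightforward: dict.iteritems() is a Python 2-only method, removed in Python 3, so the script is being run by a `python` that resolves to Python 3. A minimal sketch of the portable fix (the digests map here is a stand-in with made-up values, not copied from write_ui_dist_file_map.py):

```python
# items() is the spelling that works on both Python 2 and 3:
# it returns a view on 3 and a list on 2, which is fine for a
# small fname -> digest map like this one.
digests = {"ui/index.html": "deadbeef", "ui/perfetto.css": "cafef00d"}

for fname, digest in sorted(digests.items()):
    print("%s %s" % (fname, digest))
```

On Python 3 the old spelling simply does not exist, which is exactly the AttributeError above.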
Re-running repeatedly (10 or so manual attempts), I only seem to get hangs after this, with no output at all. An strace of the hung build looks like this:
[pid 4719] <... wait4 resumed> 0x7f93ded4bcf4, WNOHANG, NULL) = 0
[pid 4707] <... fcntl resumed> ) = 0x8000 (flags O_RDONLY|O_LARGEFILE)
[pid 4719] futex(0x7f93dffa9d8c, FUTEX_WAIT_BITSET_PRIVATE, 0, {tv_sec=86943, tv_nsec=190114030}, 0xffffffff <unfinished ...>
[pid 4707] fstat(25, {st_mode=S_IFREG|0644, st_size=2858, ...}) = 0
[pid 4707] read(25, "#!/usr/bin/env python\n# Copyrigh"..., 4096) = 2858
[pid 4707] close(25) = 0
[pid 4707] lseek(23, 0, SEEK_SET) = 0
[pid 4707] futex(0x7f93dffa9d8c, FUTEX_WAKE_PRIVATE, 1 <unfinished ...>
[pid 4719] <... futex resumed> ) = 0
[pid 4707] <... futex resumed> ) = 1
[pid 4719] futex(0x7f93dffa9d90, FUTEX_WAIT_PRIVATE, 2, NULL <unfinished ...>
[pid 4707] futex(0x7f93dffa9d90, FUTEX_WAKE_PRIVATE, 1 <unfinished ...>
[pid 4719] <... futex resumed> ) = -1 EAGAIN (Resource temporarily unavailable)
[pid 4707] <... futex resumed> ) = 0
[pid 4719] futex(0x7f93dffa9d90, FUTEX_WAKE_PRIVATE, 1 <unfinished ...>
[pid 4707] read(23, <unfinished ...>
[pid 4719] <... futex resumed> ) = 0
[pid 4707] <... read resumed> "#!/usr/bin/env python\n# Copyrigh"..., 8192) = 2858
[pid 4719] wait4(4709, 0x7f93ded4bcf4, WNOHANG, NULL) = 0
[pid 4719] wait4(4708, 0x7f93ded4bcf4, WNOHANG, NULL) = 0
[pid 4707] futex(0x7f93dffa9d88, FUTEX_WAIT_BITSET_PRIVATE, 0, {tv_sec=86943, tv_nsec=191437734}, 0xffffffff <unfinished ...>
[pid 4719] futex(0x7f93dffa9d88, FUTEX_WAKE_PRIVATE, 1) = 1
[pid 4707] <... futex resumed> ) = 0
[pid 4719] poll([{fd=9, events=POLLIN}, {fd=10, events=POLLIN}, {fd=11, events=POLLIN}, {fd=13, events=POLLIN}, {fd=15, events=POLLIN}, {fd=17, events=POLLIN}, {fd=19, events=POLLIN}, {fd=21, events=POLLIN}, {fd=5, events=POLLIN}, {fd=7, events=POLLIN}], 10, -1 <unfinished ...>
[pid 4707] futex(0x7f93dffa9d90, FUTEX_WAKE_PRIVATE, 1) = 0
[pid 4707] close(23) = 0
[pid 4707] write(2, "Traceback (most recent call last"..., 311 <unfinished ...>
[pid 4699] <... ppoll resumed> ) = 1 ([{fd=5, revents=POLLIN}])
[pid 4707] <... write resumed> ) = 311
[pid 4699] rt_sigpending([], 8) = 0
[pid 4699] read(5, "Traceback (most recent call last"..., 4096) = 311
[pid 4699] ppoll([{fd=5, events=POLLIN|POLLPRI}], 1, NULL, [], 8 <unfinished ...>
[pid 4707] getpid() = 4707
[pid 4707] futex(0x7f93e01df000, FUTEX_WAIT_BITSET|FUTEX_CLOCK_REALTIME, 0, NULL, 0xffffffff
From the strace I can see that something (pid 4699) has just read a Python traceback from a pipe, but it never gets as far as propagating that traceback to stderr. I suspect it is the same traceback as the one above, which I was lucky to observe on an earlier run.
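One classic way a CPython process ends up parked forever on a futex, as pid 4707 does in the last line above, is a lock inherited across fork() in the locked state: the child copies the lock as held, but the thread that would release it does not exist on the child's side. Whether that is actually what happened here is speculation, but the pattern is easy to reproduce (a POSIX-only sketch; the timings and the 2-second timeout are arbitrary choices to keep it runnable rather than truly deadlocked):

```python
import os
import threading
import time

# A thread takes a lock and holds it across a fork().
lock = threading.Lock()

def hold_briefly():
    with lock:
        time.sleep(0.5)

t = threading.Thread(target=hold_briefly)
t.start()
time.sleep(0.1)           # make sure the thread owns the lock first
pid = os.fork()
if pid == 0:
    # Child: the lock was copied in the "held" state, and the thread
    # that would release it only exists in the parent. Without the
    # timeout, this acquire() would block forever in a futex wait.
    got_it = lock.acquire(timeout=2)
    os._exit(0 if got_it else 1)
_, status = os.waitpid(pid, 0)
deadlocked = os.WEXITSTATUS(status) == 1
print("child would deadlock:", deadlocked)
t.join()
```

In the parent the worker thread releases the lock after half a second, but the child's copy is never released, so the child's acquire times out; replace the timeout with a plain acquire() and the child hangs indefinitely, much like the wedged process here.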