
proxy-wasm-cpp-host's People

Contributors

bianpengyuan, bryanmcquade, chaoqin-li1123, dio, gbrail, ingwonsong, jamesmulcahy, jplevyak, keith, kfaseela, knm3000, kyessenov, lum1n0us, martijneken, mathetake, mpwarres, phlax, phmx, piotrsikora, q82419, riverphillips, ryanapilado, sitano


proxy-wasm-cpp-host's Issues

Flaky WasmEngines/TestVm.WasmMemoryLimit/v8 on macOS x86_64

As mentioned in: #310 (comment)

WasmEngines/TestVm.WasmMemoryLimit/v8 on macOS x86_64 is flaky: running it 100 times (repeated several times) produced 3 failures on average.

[ RUN      ] WasmEngines/TestVm.WasmMemoryLimit/v8
TRACE from integration: [host->vm] infinite_memory()


#
# Fatal error in external/v8/src/heap/array-buffer-sweeper.cc, line 249
# Debug check failed: old_.bytes_ >= bytes (12 vs. 229965824).
#
#
#
#FailureMessage Object: 0x7ff7bd451e40
==== C stack trace ===============================

    0   runtime_test                        0x00000001067c002e v8::base::debug::StackTrace::StackTrace() + 30
    1   runtime_test                        0x00000001067c0065 v8::base::debug::StackTrace::StackTrace() + 21
    2   runtime_test                        0x0000000105a5b56e v8::platform::(anonymous namespace)::PrintStackTrace() + 30
    3   runtime_test                        0x000000010678a0a8 V8_Fatal(char const*, int, char const*, ...) + 312
    4   runtime_test                        0x000000010678998c v8::base::(anonymous namespace)::DefaultDcheckHandler(char const*, int, char const*) + 44
    5   runtime_test                        0x000000010678a117 V8_Dcheck(char const*, int, char const*) + 39
    6   runtime_test                        0x000000010375a3a5 v8::internal::ArrayBufferSweeper::Detach(v8::internal::JSArrayBuffer, v8::internal::ArrayBufferExtension*) + 325
    7   runtime_test                        0x00000001038ae9cc v8::internal::Heap::DetachArrayBufferExtension(v8::internal::JSArrayBuffer, v8::internal::ArrayBufferExtension*) + 60
    8   runtime_test                        0x0000000103e726bc v8::internal::JSArrayBuffer::Detach(bool) + 188
    9   runtime_test                        0x0000000104aebc14 v8::internal::WasmMemoryObject::Grow(v8::internal::Isolate*, v8::internal::Handle<v8::internal::WasmMemoryObject>, unsigned int) + 2724
    10  runtime_test                        0x00000001046c45f8 v8::internal::__RT_impl_Runtime_WasmMemoryGrow(v8::internal::RuntimeArgumentsWithoutHandles, v8::internal::Isolate*) + 456
    11  runtime_test                        0x00000001046c4403 v8::internal::Runtime_WasmMemoryGrow(int, unsigned long*, v8::internal::Isolate*) + 227
    12  runtime_test                        0x0000000102fa8ef9 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_NoBuiltinExit + 57
    13  runtime_test                        0x000000010303543e Builtins_WasmMemoryGrow + 62
================================================================================

Migrate runtime builds from Envoy repository to cpp-host

Runtime build issues/handling should be decoupled from host implementations, e.g. the link time symbol issue envoyproxy/envoy#14012

TODOs:

Related to #49

Consider adding an ABI call to get unique id for each VM

This would allow an extension to selectively enable logic on specific VMs. For example, in a local rate limit extension, we'd want only one VM to perform token bucket refilling; a VM uid enables this use case by registering the uid in shared data.
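As a sketch of how a VM uid plus shared data could gate single-VM work (the names below are hypothetical, not part of the proxy-wasm ABI): each VM attempts to compare-and-swap its uid into a well-known shared-data key, and only the winner runs the refill logic. A minimal in-memory simulation of that pattern:

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <utility>

// In-memory stand-in for the host's shared data (key -> {value, cas}).
// In a real extension this would go through proxy_get/set_shared_data.
struct SharedData {
  std::map<std::string, std::pair<std::string, uint32_t>> data_;

  // Set succeeds only if `cas` matches the stored cas (0 when absent).
  bool setCas(const std::string& key, const std::string& value, uint32_t cas) {
    auto it = data_.find(key);
    uint32_t current = (it == data_.end()) ? 0 : it->second.second;
    if (cas != current) return false;
    data_[key] = {value, current + 1};
    return true;
  }
};

// Each VM calls this with its (hypothetical) uid; exactly one CAS with
// cas == 0 can succeed, so exactly one VM becomes the refiller.
bool tryBecomeRefiller(SharedData& sd, const std::string& vm_uid) {
  return sd.setCas("rate_limit_refiller", vm_uid, /*cas=*/0);
}
```

With several VMs racing on startup, only the first CAS wins, and that VM alone arms the refill timer.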

Add a thread agnostic class to manage wasm vms and plugins

Currently, getOrCreateThreadLocalWasm and getOrCreateThreadLocalPlugin use two static thread-local hash maps to bookkeep the existing healthy Wasm VMs and plugins. This is not compatible with Envoy's thread-local slots and prevents the Wasm lifetime management code from being reused. It may be cleaner to refactor the management of Wasm VMs and plugins into a thread-agnostic WasmManager class that owns their lifetimes. The WasmManager could then be used as a thread-local variable in a more flexible way, and all of the Wasm management code could live there. The class interface could look like:

class WasmManager {
public:
  WasmHandleSharedPtr getWasm(const std::string& vm_key);
  WasmHandleSharedPtr createWasm(const std::string& vm_key, ...);
  PluginHandleSharedPtr getPlugin(const std::string& vm_key, const std::string& plugin_key);
  PluginHandleSharedPtr createPlugin(const WasmHandleSharedPtr& wasm_handle,
                                     const PluginSharedPtr& plugin, ...);

private:
  std::unordered_map<std::string, WasmHandleSharedPtr> wasms_;
  std::unordered_map<std::string, PluginHandleSharedPtr> plugins_;
};

Implement DWARF parser for better stack traces

Luckily, all of our SDK languages except AssemblyScript are LLVM-based, so we have a .debug_line custom section as long as modules are compiled with debug info.

I think it would be really useful for users if we had a minimal implementation of a parser for it and used it to produce better stack traces (demangled symbols with file names) when the section is available.

This may come with some computational cost, so it would be a trade-off between mangled-yet-efficient stack traces using the name custom section and demangled-yet-expensive ones.

I have WIP code locally but haven't been able to finish it; I need more cycles to work on this. Maybe we could use LLVM's full-fledged DWARF parser (the one used internally by llvm-dwarfdump), but I think that would be too much just for parsing .debug_line.

Edited: if we want function names, we also have to parse .debug_info. I wonder if V8 already has API(s) for retrieving DWARF information.
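Before any DWARF parsing, the host needs to locate the named custom section in the module bytes. A minimal sketch of that step, assuming raw module bytes as input (this mirrors what a getCustomSection-style utility does; it is not the repo's actual implementation):

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <string_view>
#include <vector>

// Decode an unsigned LEB128 value from `b`, advancing `pos`.
static uint64_t decodeULEB128(const std::vector<uint8_t>& b, size_t& pos) {
  uint64_t result = 0;
  int shift = 0;
  while (pos < b.size()) {
    uint8_t byte = b[pos++];
    result |= uint64_t(byte & 0x7f) << shift;
    if ((byte & 0x80) == 0) break;
    shift += 7;
  }
  return result;
}

// Return the payload of the named custom section (e.g. ".debug_line"),
// or an empty string if the module does not contain it.
static std::string getCustomSection(const std::vector<uint8_t>& b, std::string_view name) {
  size_t pos = 8;  // skip the "\0asm" magic and the 4-byte version
  while (pos < b.size()) {
    uint8_t section_id = b[pos++];
    uint64_t section_len = decodeULEB128(b, pos);
    size_t section_end = pos + section_len;
    if (section_id == 0) {  // custom section: name length + name + payload
      size_t p = pos;
      uint64_t name_len = decodeULEB128(b, p);
      std::string_view sec_name(reinterpret_cast<const char*>(&b[p]), name_len);
      p += name_len;
      if (sec_name == name) {
        return std::string(reinterpret_cast<const char*>(&b[p]), section_end - p);
      }
    }
    pos = section_end;  // skip non-matching sections
  }
  return "";
}
```

The payload returned for .debug_line would then be fed to the actual DWARF line-program parser.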

Add a one-shot inter-thread notification mechanism

(Not sure what repo to put this idea in...)

I'd like to be able to use a singleton module to store and atomically handle shared state for the filters running on all the Envoy threads. Ideally, it'd be great to support a way to send a "request" to a singleton WASM module running as a bootstrap service, and get a reply. Basically, it's an inter-thread RPC for WASM modules.

This works today (or it will once PR #36 is fixed) with the existing queues. For instance, in a filter's "on_http_request" method, we could dequeue from a queue with a unique name specific to that thread. That would be fine if the thread only ever processed one request at a time, but it doesn't. So each thread needs a pool of queues, or some other mechanism to prevent responses from being swapped, and now things are starting to get complicated.

A nicer way to solve this might be using either a "temporary queue," or some sort of one-shot "wait for reply" method. The idea would work something like this:

  1. WASM service (running as a singleton using the bootstrap extensions) registers a queue
  2. In the filter's proxy_on_ method, register the temporary queue or reply thing, and get back an ID
  3. The filter sends a message to the WASM service that includes the temporary queue ID
  4. When the WASM service is done it enqueues its response using the temporary ID
  5. The filter can be notified using the existing on_queue_ready callback, or some new one

The actual mechanism could just be one new ABI:

proxy_register_temporary_queue(name) -> token

The resulting queue would be automatically deleted when the context exits.

The following is even more explicit, but still pretty clean:

// Get a unique token where a reply can be delivered by any thread
proxy_register_reply_slot() -> token
// Put data in the reply slot and notify the context that created it
proxy_deliver_reply(token, data)
// Called to let the context know that a reply is available
proxy_on_reply_available(context_id, token)

Does anyone else need to solve this problem? Let me know. Thanks!
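A host-side sketch of the proposed reply slot (all names here are hypothetical; none of this is an existing ABI): a registry hands out one-shot tokens, a delivery from any thread stores the payload, and the owning context later consumes it. A real implementation would also wake the owning context's dispatcher so proxy_on_reply_available fires on its thread.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <mutex>
#include <optional>
#include <string>
#include <utility>

// Hypothetical registry backing proxy_register_reply_slot / proxy_deliver_reply.
class ReplySlotRegistry {
public:
  // Called from the filter's thread; the token can be sent to the service.
  uint32_t registerSlot(uint32_t context_id) {
    std::lock_guard<std::mutex> lock(mutex_);
    uint32_t token = next_token_++;
    slots_[token] = {context_id, std::nullopt};
    return token;
  }

  // Called from any thread; one-shot, so a second delivery fails.
  bool deliver(uint32_t token, std::string data) {
    std::lock_guard<std::mutex> lock(mutex_);
    auto it = slots_.find(token);
    if (it == slots_.end() || it->second.payload) return false;
    it->second.payload = std::move(data);
    return true;
  }

  // Called by the owning context; the slot is deleted once consumed.
  std::optional<std::string> take(uint32_t token) {
    std::lock_guard<std::mutex> lock(mutex_);
    auto it = slots_.find(token);
    if (it == slots_.end() || !it->second.payload) return std::nullopt;
    auto payload = std::move(it->second.payload);
    slots_.erase(it);
    return payload;
  }

private:
  struct Slot {
    uint32_t context_id;
    std::optional<std::string> payload;
  };
  std::mutex mutex_;
  uint32_t next_token_ = 1;
  std::map<uint32_t, Slot> slots_;
};
```

Erasing the slot on consumption (and on context teardown) gives the "automatically deleted when the context exits" behavior described above.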

Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_get

I ran a Wasm module on Istio 1.9 and got some errors; the following is a detailed log:

2022-05-18T10:18:02.435367Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_get
2022-05-18T10:18:02.435389Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_dir_name
2022-05-18T10:18:02.435401Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.path_open
2022-05-18T10:18:02.435407Z	error	envoy wasm	Wasm VM failed Failed to initialize Wasm code
2022-05-18T10:18:02.436387Z	critical	envoy wasm	Plugin configured to fail closed failed to load
2022-05-18T10:18:02.436531Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:02.436546Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:02.436899Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:03.768273Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_get
2022-05-18T10:18:03.768293Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_dir_name
2022-05-18T10:18:03.768296Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.path_open
2022-05-18T10:18:03.768301Z	error	envoy wasm	Wasm VM failed Failed to initialize Wasm code
2022-05-18T10:18:03.769115Z	critical	envoy wasm	Plugin configured to fail closed failed to load
2022-05-18T10:18:03.769124Z	critical	envoy wasm	Plugin configured to fail closed failed to load
2022-05-18T10:18:03.769133Z	critical	envoy wasm	Plugin configured to fail closed failed to load
2022-05-18T10:18:03.769176Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:03.769181Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:03.769183Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:03.860205Z	warn	Envoy proxy is NOT ready: Get "http://127.0.0.1:15000/stats?usedonly&filter=^(cluster_manager\\.cds|listener_manager\\.lds)\\.(update_success|update_rejected)$": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
2022-05-18T10:18:04.085255Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_get
2022-05-18T10:18:04.085278Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_dir_name
2022-05-18T10:18:04.085281Z	error	envoy wasm	Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.path_open
2022-05-18T10:18:04.085286Z	error	envoy wasm	Wasm VM failed Failed to initialize Wasm code
2022-05-18T10:18:04.086084Z	critical	envoy wasm	Plugin configured to fail closed failed to load
2022-05-18T10:18:04.086140Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:04.086144Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:04.086147Z	warning	envoy config	gRPC config for type.googleapis.com/envoy.config.core.v3.TypedExtensionConfig rejected: Unable to create Wasm HTTP filter 
2022-05-18T10:18:04.088679Z	info	Initialization took 2.352548864s
2022-05-18T10:18:04.088698Z	info	Envoy proxy is ready

My EnvoyFilter :

apiVersion: networking.istio.io/v1alpha3
kind: EnvoyFilter
metadata:
  labels:
    app: mall-admin
  name: desensitize-mall-admin
  namespace: mall

spec:
  configPatches:
  - applyTo: HTTP_FILTER
    match:
      listener:
        filterChain:
          filter:
            name: envoy.http_connection_manager
      proxy:
        proxyVersion: ^1\.9.*
    patch:
      operation: INSERT_BEFORE
      value:
        config_discovery:
          config_source:
            ads: {}
            initial_fetch_timeout: 0s
          type_urls:
          - type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
        name: desensitize-mall-admin
  - applyTo: EXTENSION_CONFIG
    match: {}
    patch:
      operation: ADD
      value:
        name: desensitize-mall-admin
        typed_config:
          '@type': type.googleapis.com/udpa.type.v1.TypedStruct
          type_url: type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
          value:
            config:
              configuration:
                '@type': type.googleapis.com/google.protobuf.StringValue
                value: |
                  {
                    
                  }
              vm_config:
                code:
                  remote:
                    http_uri:
                      uri: http://10.10.13.47:2333/wasm/desensitize.wasm
                runtime: envoy.wasm.runtime.v8
                vm_id: desensitize-mall-admin
  workloadSelector:
    labels:
      app: mall-admin

When I run on Istio 1.13, the services run OK.
My Wasm module was built with tinygo 1.17.

Can anyone help?

tcp_metadata_exchange proxy test fails on big-endian platforms

After we introduced big-endian support in proxy-wasm #198 and fixed
#197, the following new error started happening (a regression of #198):
the tcp_metadata_exchange test from the proxy repo fails on big-endian platforms (the envoy binary crashes) https://github.com/istio/proxy/blob/master/test/envoye2e/tcp_metadata_exchange/tcp_metadata_exchange_test.go

go test ./test/envoye2e/tcp_metadata_exchange -v
...

[2022-07-25 11:05:29.028][240769][critical][assert] [external/envoy/source/exe/main_common.cc:84] panic: out of memory
[2022-07-25 11:05:29.091][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:104] Caught Aborted, suspect faulting address 0x3ac8100000000
[2022-07-25 11:05:29.226][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:91] Backtrace (use tools/stack_decode.py to get line numbers):
[2022-07-25 11:05:29.226][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:92] Envoy version: b977767d8be6ce88e1956cd1f6165242aab2a572/1.20.6/Modified/RELEASE/OpenSSL
[2022-07-25 11:05:29.256][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:98] #0: [0x20025f33490]
[2022-07-25 11:05:29.257][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #1: __pthread_kill_implementation [0x2002679eb86]
[2022-07-25 11:05:29.258][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #2: gsignal [0x200267500e0]
[2022-07-25 11:05:29.258][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #3: abort [0x2002672b100]
[2022-07-25 11:05:29.362][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #4: Envoy::MainCommonBase::MainCommonBase()::$_1::__invoke() [0x2aa3cea39bc]
[2022-07-25 11:05:29.372][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #5: tcmalloc::cpp_throw_oom() [0x2aa4076b0a0]
[2022-07-25 11:05:29.381][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #6: tcmalloc::allocate_full_cpp_throw_oom() [0x2aa4076bd1c]
[2022-07-25 11:05:29.399][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #7: flatbuffers::FlatBufferBuilder::CreateString() [0x2aa3ce7140a]
[2022-07-25 11:05:29.408][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #8: Wasm::Common::extractLocalNodeFlatBuffer() [0x2aa3ce6aa66]
[2022-07-25 11:05:29.417][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #9: proxy_wasm::null_plugin::Stats::PluginRootContext::configure() [0x2aa3cdbbcc0]
[2022-07-25 11:05:29.426][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #10: proxy_wasm::null_plugin::Stats::PluginRootContext::onConfigure() [0x2aa3cdbbc18]
[2022-07-25 11:05:29.435][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #11: std::_Function_handler<>::_M_invoke() [0x2aa3d9a2898]
[2022-07-25 11:05:29.445][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #12: proxy_wasm::ContextBase::onConfigure() [0x2aa3d9a83c8]
[2022-07-25 11:05:29.463][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #13: proxy_wasm::WasmBase::configure() [0x2aa3d9c42b4]
[2022-07-25 11:05:29.474][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #14: proxy_wasm::createWasm() [0x2aa3d9c66da]
[2022-07-25 11:05:29.484][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #15: Envoy::Extensions::Common::Wasm::createWasm()::$_8::operator()() [0x2aa3d6d05f4]
[2022-07-25 11:05:29.512][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #16: Envoy::Extensions::Common::Wasm::createWasm() [0x2aa3d6cd924]
[2022-07-25 11:05:29.527][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #17: Envoy::Extensions::NetworkFilters::Wasm::FilterConfig::FilterConfig() [0x2aa3d5ecf9a]
[2022-07-25 11:05:29.537][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #18: Envoy::Extensions::NetworkFilters::Wasm::WasmFilterConfig::createFilterFactoryFromProtoTyped() [0x2aa3d5eab6c]
[2022-07-25 11:05:29.547][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #19: Envoy::Extensions::NetworkFilters::Common::FactoryBase<>::createFilterFactoryFromProto() [0x2aa3d5eb976]
[2022-07-25 11:05:29.557][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #20: Envoy::Server::ProdListenerComponentFactory::createNetworkFilterFactoryList_() [0x2aa3f96c45a]
[2022-07-25 11:05:29.575][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #21: Envoy::Server::ListenerFilterChainFactoryBuilder::buildFilterChainInternal() [0x2aa3f97985a]
[2022-07-25 11:05:29.586][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #22: Envoy::Server::ListenerFilterChainFactoryBuilder::buildFilterChain() [0x2aa3f979526]
[2022-07-25 11:05:29.599][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #23: Envoy::Server::FilterChainManagerImpl::addFilterChains() [0x2aa3f996c4c]
[2022-07-25 11:05:29.609][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #24: Envoy::Server::ListenerImpl::ListenerImpl() [0x2aa3f95f46a]
[2022-07-25 11:05:29.620][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #25: Envoy::Server::ListenerManagerImpl::addOrUpdateListenerInternal() [0x2aa3f974652]
[2022-07-25 11:05:29.630][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #26: Envoy::Server::ListenerManagerImpl::addOrUpdateListener() [0x2aa3f9735d8]
[2022-07-25 11:05:29.658][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #27: Envoy::Server::LdsApiImpl::onConfigUpdate() [0x2aa3f9e8e16]
[2022-07-25 11:05:29.691][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #28: Envoy::Server::LdsApiImpl::onConfigUpdate() [0x2aa3f9ea7f8]
[2022-07-25 11:05:29.718][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #29: Envoy::Config::GrpcSubscriptionImpl::onConfigUpdate() [0x2aa3fb27e70]
[2022-07-25 11:05:29.739][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #30: Envoy::Config::GrpcMuxImpl::onDiscoveryResponse() [0x2aa3fb309a8]
[2022-07-25 11:05:29.748][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #31: Envoy::Grpc::AsyncStreamCallbacks<>::onReceiveMessageRaw() [0x2aa3fb34a3c]
[2022-07-25 11:05:29.763][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #32: Envoy::Grpc::AsyncStreamImpl::onData() [0x2aa3fb570c6]
[2022-07-25 11:05:29.780][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #33: Envoy::Http::AsyncStreamImpl::encodeData() [0x2aa3fb60b4c]
[2022-07-25 11:05:29.789][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #34: Envoy::Router::UpstreamRequest::decodeData() [0x2aa3fbcb9f0]
[2022-07-25 11:05:29.798][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #35: Envoy::Http::ResponseDecoderWrapper::decodeData() [0x2aa3f81b63e]
[2022-07-25 11:05:29.807][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #36: Envoy::Http::Http2::ConnectionImpl::onFrameReceived() [0x2aa3faa37a0]
[2022-07-25 11:05:29.816][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #37: Envoy::Http::Http2::ConnectionImpl::Http2Callbacks::Http2Callbacks()::$_22::__invoke() [0x2aa3fab0824]
[2022-07-25 11:05:29.836][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #38: nghttp2_session_on_data_received [0x2aa3fd088e4]
[2022-07-25 11:05:29.846][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #39: nghttp2_session_mem_recv [0x2aa3fd0ab0a]
[2022-07-25 11:05:29.855][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #40: Envoy::Http::Http2::ConnectionImpl::dispatch() [0x2aa3faa1a00]
[2022-07-25 11:05:29.864][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #41: Envoy::Http::CodecClient::onData() [0x2aa3f8cb478]
[2022-07-25 11:05:29.873][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #42: Envoy::Http::CodecClient::CodecReadFilter::onData() [0x2aa3f8cd5e2]
[2022-07-25 11:05:29.882][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #43: Envoy::Network::FilterManagerImpl::onRead() [0x2aa3fb19de2]
[2022-07-25 11:05:29.891][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #44: Envoy::Network::ConnectionImpl::onReadReady() [0x2aa3fb0f666]
[2022-07-25 11:05:29.900][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #45: Envoy::Network::ConnectionImpl::onFileEvent() [0x2aa3fb0ccc8]
[2022-07-25 11:05:29.908][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #46: std::_Function_handler<>::_M_invoke() [0x2aa3fae79da]
[2022-07-25 11:05:29.917][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #47: Envoy::Event::FileEventImpl::assignEvents()::$_1::__invoke() [0x2aa3fae8d84]
[2022-07-25 11:05:29.926][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #48: event_process_active_single_queue [0x2aa3fcf8b3e]
[2022-07-25 11:05:29.936][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #49: event_base_loop [0x2aa3fcf757e]
[2022-07-25 11:05:29.945][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #50: Envoy::Event::DispatcherImpl::run() [0x2aa3fae512e]
[2022-07-25 11:05:29.973][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #51: Envoy::Server::InstanceImpl::run() [0x2aa3f4cadf6]
[2022-07-25 11:05:29.983][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #52: Envoy::MainCommonBase::run() [0x2aa3cea1312]
[2022-07-25 11:05:29.992][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #53: Envoy::MainCommon::main() [0x2aa3cea1c76]
[2022-07-25 11:05:30.001][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #54: main [0x2aa3ce9d13e]
[2022-07-25 11:05:30.001][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #55: __libc_start_call_main [0x200267317f2]
[2022-07-25 11:05:30.002][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:96] #56: __libc_start_main@GLIBC_2.2 [0x200267318d0]
[2022-07-25 11:05:30.010][240769][critical][backtrace] [external/envoy/source/server/backtrace.h:98] #57: [0x2aa3cc9ee90]
AsyncClient 0x2aa43ecb080, stream_id_: 8385127285791150716

The root cause is that the tcp_metadata_exchange test uses the null-vm, and in #198 we introduced byte swapping on big-endian for all Wasm runtimes (including the null-vm). Since the null-vm does not use a Wasm binary, we should not reverse bytes for it (for the other Wasm runtimes we still need to reverse bytes on big-endian).
Opened #282 to fix it.
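The fix can be sketched as a word-order helper that swaps only for real Wasm runtimes (function names here are illustrative, not the actual patch): Wasm linear memory is always little-endian, but the null-vm executes native code whose values are already in host byte order.

```cpp
#include <cassert>
#include <cstdint>

// Portable 32-bit byte swap.
inline uint32_t bswap32(uint32_t v) {
  return ((v & 0x000000FFu) << 24) | ((v & 0x0000FF00u) << 8) |
         ((v & 0x00FF0000u) >> 8) | ((v & 0xFF000000u) >> 24);
}

// Convert a 32-bit value read from VM memory to host byte order.
// Wasm linear memory is little-endian, so a big-endian host must swap --
// except for the null-vm, whose values never pass through a little-endian
// representation in the first place.
inline uint32_t wasmToHost32(uint32_t v, bool host_is_big_endian, bool is_null_vm) {
  if (is_null_vm || !host_is_big_endian) return v;
  return bswap32(v);
}
```

The pre-#282 bug corresponds to ignoring the is_null_vm flag and swapping unconditionally on big-endian hosts.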

Use of local crates possible?

I am trying to use a local version of the wasmtime crate, from a wasmtime repo available locally on my machine (wasmtime/crates/wasmtime).
I have updated the proxy-wasm-cpp-host/bazel/cargo/wasmtime/Cargo.toml dependencies to include local paths according to this documentation:

wasmtime = {path = "/home/user/pvt_repo/wasmtime/crates/wasmtime", default-features = false, features = ['cranelift']}
wasmtime-c-api-macros = {path = "/home/user/pvt_repo/wasmtime/crates/c-api/macros"}

After that I run cargo raze --generate-lockfile. When I check crates.bzl, the url field is empty (for example for wasmtime__wasmtime__1_0_0), and this causes issues when I eventually build Envoy, which complains about it.

Is there a way to use local copies of crates/repos? I have also updated repositories.bzl to reflect a local url path with file:/// ()

Tracker for issues reported to the V8 team

Extract common code from runtime implementations

Unfortunately, we have a lot of code duplication and runtime-agnostic code in runtime-specific implementations. At the very least, all bytecode parsing (getCustomSection, getStrippedSource, buildFunctionNameIndex), logic around precompiled modules and signature verification should be extracted to common utils.

Resolve symbol collisions among Wasm runtimes

Migrated from envoyproxy/envoy#14012.

The more runtimes we have, the more we want a single binary that can, at least, benchmark them all at once.

Using objcopy --prefix-symbols on all the object files of the runtimes might be a solution, though I'm not sure what changes would need to be made to the C++ interfaces (V8 and WAVM, for example). Changes to the C interfaces would be trivial. This might also have side effects on the final binary size.

Relevant to #113

on_vm_start is called multiple times when multiple plugins exist in a single VM.

WasmBase::start is called for every plugin:

auto plugin_context = wasm_handle->wasm()->start(plugin);

and inside of it ContextBase::onStart is called:

if (!context_ptr->onStart(plugin)) {

that results in calling on_vm_start:

if (wasm_->on_vm_start_) {
  // Do not set plugin_ as the on_vm_start handler should be independent of the plugin since the
  // specific plugin which ends up calling it is not necessarily known by the Wasm module.
  result =
      wasm_->on_vm_start_(this, id_, static_cast<uint32_t>(wasm()->vm_configuration().size()))
          .u64_ != 0;
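One possible direction is to guard the VM-level callback so only the first plugin started on a VM triggers on_vm_start, while per-plugin startup still runs for every plugin. A toy simulation of that guard (not the actual WasmBase code, where the fix would live in WasmBase::start / ContextBase::onStart):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Toy model of a VM hosting multiple plugins.
class ToyVm {
public:
  void startPlugin(const std::string& plugin_key) {
    if (!vm_started_) {  // fire the VM-level hook exactly once
      vm_start_calls_++;
      vm_started_ = true;
    }
    started_plugins_.push_back(plugin_key);  // per-plugin startup always runs
  }

  int vmStartCalls() const { return vm_start_calls_; }
  size_t pluginCount() const { return started_plugins_.size(); }

private:
  bool vm_started_ = false;
  int vm_start_calls_ = 0;
  std::vector<std::string> started_plugins_;
};
```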

Do capability restriction on a per-plugin basis

The capability restriction system introduced in #89 is currently working on a per-VM basis, but it should be done on a per-plugin basis because users can provide capability restriction configurations for each plugin, not for each VM.
Let's make the capability restriction work on a per-plugin basis. The implementation design is to be discussed in the comments.

Add support for C++ exceptions

Host implementation should support running Wasm code that was built with C++ exceptions.

Notably, those functions must be implemented:

  • __cxa_begin_catch,
  • __cxa_end_catch,
  • __cxa_free_exception,
  • __cxa_get_exception_ptr,
  • __cxa_uncaught_exceptions,
  • __cxa_find_matching_catch_2, __cxa_find_matching_catch_3, __cxa_find_matching_catch_4,
  • __resumeException,
  • getTempRet0,
  • invoke_ii, invoke_iii, invoke_iiii, invoke_iiiii, invoke_v, invoke_vi, invoke_vii, invoke_viii, invoke_viiii,
  • llvm_eh_typeid_for.

Wasm VM corruption on plugin config update

context: istio/istio#29843

When a plugin is reconfigured, the VM can get stuck in a bad state:

2021-02-04T22:06:29.003338Z	trace	envoy wasm	[host->vm] proxy_on_request_headers(6, 10, 1)
2021-02-04T22:06:29.003479Z	error	envoy wasm	Function: proxy_on_request_headers failed: Uncaught RuntimeError: function signature mismatch

This happens randomly to some VMs in an Envoy process, so it looks like the VM is corrupted by the config update.

OnRequestBody not called if onRequestHeaders returns stopIteration

I'll preface this with: it could be user error. My scenario is that I'm making a gRPC call out to some backend service. The backend server will mutate both the headers and the body, so I need to execute both onRequestHeaders and onRequestBody before getting the result.

The problem is that it hangs indefinitely, but I'm told it should work. Assuming it did work at some point, I believe it's related to this change:

#95

Sample code is here: https://github.com/bladedancer/envoy-wasm-stopitertest/blob/main/wasm-filter/context.cc#L32.

Is this scenario achievable? Is this a bug or the expected behaviour?

Cleanup random bytes

Instead of using RAND_bytes directly in Proxy-Wasm C++ Host, we should get random bytes from the embedding application, which might also have an optimized version of it available.

cc @mathetake
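A sketch of the proposed injection point (names are hypothetical): the host exposes a setter so the embedding application can supply its own random-bytes function, with a portable std::random_device fallback in place of a direct RAND_bytes call.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <functional>
#include <random>
#include <utility>

// Hypothetical embedder hook: fill `len` bytes, return false on failure.
using RandomBytesFn = std::function<bool(uint8_t* out, size_t len)>;

static RandomBytesFn g_random_bytes_fn;  // set by the embedding application

void setRandomBytesProvider(RandomBytesFn fn) { g_random_bytes_fn = std::move(fn); }

// Used by the host wherever RAND_bytes was called before.
bool getRandomBytes(uint8_t* out, size_t len) {
  if (g_random_bytes_fn) return g_random_bytes_fn(out, len);
  // Fallback: std::random_device (entropy quality is implementation-defined).
  std::random_device rd;
  for (size_t i = 0; i < len; i++) out[i] = static_cast<uint8_t>(rd());
  return true;
}
```

An embedder like Envoy could then route this to its own (possibly hardware-accelerated) random source.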

Wasm Runtime not included in vm_key

This could be a problem once we support multiple runtimes, because I think users would expect new VMs to be created with the specified runtime when they change vm_config.runtime.
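The fix amounts to folding the runtime into the key used to look up cached VMs. A hedged sketch (the function name and key layout are illustrative, not the repo's actual scheme):

```cpp
#include <cassert>
#include <string>

// Build the cache key for a VM. Including the runtime means that changing
// vm_config.runtime (e.g. v8 -> wasmtime) yields a different key, so a
// fresh VM is created instead of reusing one from another runtime.
std::string makeVmKey(const std::string& vm_id, const std::string& runtime,
                      const std::string& code_hash) {
  return vm_id + "||" + runtime + "||" + code_hash;
}
```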

WASM extension crashes on Big-endian machine

Rust WASM extension https://github.com/maistra/header-append-filter crashes on s390x:

2021-09-07T17:11:51.327902Z     error   envoy wasm      Function: proxy_on_configure failed: Uncaught RuntimeError: memory access out of bounds
Proxy-Wasm plugin in-VM backtrace:
  0:  0x1aa1 - _ZN10serde_json5value2de77_$LT$impl$u20$serde..de..Deserialize$u20$for$u20$serde_json..value..Value$GT$11deserialize17hb4f81bdbc27fce53E.llvm.10413096860284390022
  1:  0x17fa - _ZN10serde_json2de10from_slice17had30ef6d3adf186fE
  2:  0x64a0 - _ZN97_$LT$header_append_filter..HeaderAppendRootContext$u20$as$u20$proxy_wasm..traits..RootContext$GT$12on_configure17h5f96597ce1e2a394E
  3:  0xfffa - _ZN10proxy_wasm10dispatcher10Dispatcher12on_configure17h4a9f78f0de537835E
  4:  0x1220a - proxy_on_configure
2021-09-07T17:11:51.327927Z     error   envoy wasm      Wasm VM failed Failed to configure base Wasm plugin
2021-09-07T17:11:51.328771Z     critical        envoy wasm      Plugin configured to fail closed failed to load

Support for Per Route in Wasm Plugin

At this time, Wasm plugins do not support per-route configuration like Lua plugins do. If any Wasm plugin is enabled, all traffic passes through it.

This is a hard blocker and unacceptable for us when putting Wasm in production, because we allow users to enable plugins on specific routes, which Wasm cannot do. I tried using the Composite filter and the matching API for this, but as you know, the matching API is not stable and it does not support enabling multiple filters on one route. 😫

So I think per-route support is a graceful way to solve this problem, and it is an important feature for production.

@PiotrSikora @mathetake

Wasmtime default features disabled

Hi there,
I'm curious why the default features for wasmtime are disabled?

wasmtime = {version = "0.39.1", default-features = false, features = ['cranelift']}

I tried to find when this change was made, but it looks like the default features have been disabled since wasmtime was first added as a runtime. I changed it to true for wasmtime 1.0.1, and I was able to build without much issue (I had to resolve one duplicate dependency conflict in BUILD.cpufeatures-xxx, similar to the rustix crate).
I'm particularly interested since several newer optimizations for wasmtime 1.0.0 (as well as helpful features, e.g. related to profiling) seem to be included in the default features.

Or is there a different place to implement these features that I've missed?
Thanks!

Add file system WASI call stubs to support wider Wasm binaries

Some TinyGo apps have started emitting binaries with file-system-related WASI system calls:

envoy_1  | [2021-11-15 08:13:45.970][1][error][wasm] [source/extensions/common/wasm/wasm_vm.cc:38] Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.path_open
envoy_1  | [2021-11-15 08:13:45.970][1][error][wasm] [source/extensions/common/wasm/wasm_vm.cc:38] Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_get
envoy_1  | [2021-11-15 08:13:45.970][1][error][wasm] [source/extensions/common/wasm/wasm_vm.cc:38] Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_prestat_dir_name

So it would be helpful if we could add stubs for them even though we cannot actually support them. Relevant to #127

Using timer API

Hi,

I was trying to set up a ticker in a Wasm filter extension and found out that the
host's setTimerPeriod is not yet implemented.

Do you have an estimate for when this function will be implemented?
If no one is working on it, I would be happy to take it.

Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_filestat_get

Describe the bug / error

When I try to use the package "google.golang.org/protobuf/proto",
and in particular:

func (ctx *setBodyContext) OnHttpRequestBody(bodySize int, endOfStream bool) types.Action {
    _ = proto.String("x")
    return types.ActionContinue
}

I get the following error during Envoy startup:
Failed to load Wasm module due to a missing import: wasi_snapshot_preview1.fd_filestat_get

When I give up this package the error is gone and the module loads successfully.

What is your Envoy/Istio version?

envoyproxy/envoy-dev:ce4845329292c3b896fc0d26d5555e6ab54a30b7

What is the SDK version?

Commit bd6f69563ef44496906d57b61980af76d365b5ca

What is your TinyGo version?

tinygo version 0.25.0 windows/amd64 (using go version go1.19 and LLVM version 14.0.0)

URL or snippet of your code including Envoy configuration

http_filters:
- name: envoy.filters.http.wasm
  typed_config:
    "@type": type.googleapis.com/envoy.extensions.filters.http.wasm.v3.Wasm
    config:
      vm_config:
        runtime: "envoy.wasm.runtime.v8"
        code:
          local:
            filename: "grpc.wasm"

Additional context (Optional)

Original issue was opened in the Go-SDK repository, redirected to the host repository:
tetratelabs/proxy-wasm-go-sdk#324
