
Comments (4)

warrenfalk commented on August 17, 2024

Yes, I am the author of that, although the version inside rocksdb-sharp is current and that repo is obsolete... I should probably update it. It solves more than just the x86/x64 issue, I think.

I've been down this road a few times now and I'm willing to keep going down it. As far as I can tell, native runtime dependencies (i.e. native code needed at runtime) are still not a solved problem in the .NET ecosystem. My approach isn't a full solution, but neither is libuv's approach. I've seen some other approaches, which are also not solutions. But right now mine seems to me to offer the best tradeoffs. Why I don't think the problem is yet solved requires a more detailed answer, which I'll provide later when I have more time (see some later comments on #18, in answer to alexvaluyskiy, for a little more detail, but perhaps we can use your issue here as the dedicated forum to discuss it). To summarize quickly, though, the core of the issue is the difference between what is "build" layer and what is "deploy" layer in managed toolchains. For example, if I create a managed library, I can build it on any platform and the resulting binary can be consumed on any platform. But if I add to my library a dependency on libuv, then, if I'm not mistaken, where I can run my app now suddenly depends on where I built it, because NuGet will decide at build time which native binary to bundle with it. It's even worse than that in some scenarios, but that's the summary.
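The build/deploy mismatch can be made concrete with an illustrative publish layout (paths and file names here are hypothetical, not taken from any actual package):

```
# Built on Windows x64; the native binary is chosen at build time:
bin/Release/net45/
├── MyApp.exe
├── ManagedWrapper.dll
└── libuv.dll          # only the win-x64 native binary was bundled

# Copy that same output to a Linux host and it fails at startup:
# DllNotFoundException: libuv
```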

If I'm mistaken about how libuv's solution works, maybe you or someone on that team can set me straight. I still have yet to find anyone with a really good best practice out there on how to deal with native binaries. But maybe it exists.

I think I will type up a more clear description of what I think the problem is, though. Because if you can come up with a more elegant solution to native runtime dependencies, whether it be tweaks, a total replacement, or some hybrid, I'm all ears.

from rocksdb-sharp.

mkosieradzki commented on August 17, 2024

Thanks a lot for your answer. I am definitely not an expert in the area of interop; however, I am interested in the different solutions for handling this case.

First of all, I think the problem is far more complex, as .NET Core opens completely new scenarios, including AOT compilation of .NET assemblies and fully cross-platform interop for existing native libraries.

None of those mechanisms seems to be documented very well, which makes things more difficult.

But I did some tests again, and you are definitely right when it comes to the full .NET Framework: depending on the configuration, the appropriate runtime is copied to the output folder.

But there is good news when using .NET Core: when publishing a .NET Core application (with the portable preset), all runtimes are copied to the runtimes sub-folder of the output folder.

.NET Core will also take care of loading the appropriate native library at runtime. (I have tested this with the .NET Core 2.0 preview 2 tooling; not sure about 1.0 and 1.1.)

There is also the new NuGet packaging mechanism, which I don't fully understand yet, that generates separate runtime NuGet packages.
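The layout the .NET Core tooling consumes follows the standard NuGet convention for runtime-specific assets, roughly like this (the RIDs shown are the common ones; the exact set is up to the package author):

```
RocksDbSharp.nupkg
└── runtimes/
    ├── win-x64/native/rocksdb.dll
    ├── linux-x64/native/librocksdb.so
    └── osx-x64/native/librocksdb.dylib
```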

So, thinking about this specific project:

  • .NET Framework is mostly Windows (plus some Mono)
  • There is currently a strong preference for x64 for RocksDB
  • We can expect a lot of .NET Core, especially in the xplat area
  • We can also expect AOT compilation to gain traction for .NET Core soon...

The ideal solution would be:

  • Use auto-native-import for .NET Framework and mono
  • Use "runtimes" and P/Invoke for .NET Core (both JIT and AOT).

IMO second good solution would be:

  • Force x64 on .NET Framework and mono
  • Use "runtimes" and P/Invoke for .NET Core (both JIT and AOT).


warrenfalk commented on August 17, 2024

I like the second solution. I don't want to maintain two versions of native function declarations. That would be a huge pain. I think the second solution might be achievable.

I was unaware of .NET Core's publishing behavior and its ability to load from the runtimes folder structure. This is good news indeed. I have verified that 1.1 will also construct the runtimes folder on publish (but I haven't confirmed that it also loads from it correctly; let's assume it does for now).

I am also fine with ditching all 32-bit support except on the Mac. Windows doesn't have it, and Linux doesn't need it. It is necessary on the Mac because Mono is still 32-bit by default, which otherwise makes prototyping RocksDB there a huge pain. But the good news is that the Mac is the one platform where you can create one library that supports both 32 and 64 bit.

So then this allows us to copy librocksdb.so, librocksdb.dylib, and rocksdb.dll to the root of the build output, which is enough for .NET Framework's and Mono's P/Invoke to pick up the right one for the platform based on extension. (Note: this means it will probably not be possible to give a helpful error message to those .NET Framework developers who get "BadImageFormatException: An attempt was made to load a program with an incorrect format" when their IDE defaults their project to 32 bit.)

So then the remaining problem is that the .NET Framework's build and publish tools create a build product only for the platform they are building on. So today, if I add a package dependency with native rocksdb binaries to my project targeting net45 and then publish it, the published output will not run on Debian, because it will have rocksdb.dll and lack librocksdb.so. This is possibly the only remaining problem we have left to solve.

The RocksDbNative package was the solution to this before. It ensured that all native binaries are copied to the publish output even in a .NET Framework environment. If we stop deploying separate 32-bit native binaries, it could be modified to copy only the 64-bit versions to the build output root so that P/Invoke will work.
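That copy step could be implemented as an MSBuild targets file shipped inside the package (a hypothetical sketch; the file layout inside the package is illustrative, not RocksDbNative's actual one):

```xml
<!-- RocksDbNative.targets (hypothetical): copy the 64-bit native binaries
     to the build output root so extension-based P/Invoke resolution works. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <NativeBinary Include="$(MSBuildThisFileDirectory)..\native\amd64\*" />
    <None Include="@(NativeBinary)">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <Link>%(Filename)%(Extension)</Link>
    </None>
  </ItemGroup>
</Project>
```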

Originally RocksDbSharp and RocksDbNative were separate so that a developer could either install only RocksDbSharp and rely on the platform's own rocksdb installation or install both and bundle the dependency directly. I now think that it would be better to create a RocksDbSharp package that includes the native binaries. If a package without binaries is needed in the future, it could be created then.

So a possible solution is this:

  1. Switch to P/Invoke alone.
  2. Add the native binaries to the RocksDbSharp package using runtimes folder.
  3. Remove the 32 bit only binaries from RocksDbNative and have it copy the binaries to the output root.
  4. Build the mac native as dual 32/64 bit.
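Step 4 can be done either by asking the compiler for both architectures in one pass or by merging two single-arch builds with lipo. A sketch, assuming clang on macOS (flags abbreviated; object/file names are placeholders):

```
# Build both slices in one invocation:
clang++ -arch i386 -arch x86_64 -dynamiclib -o librocksdb.dylib <objects...>

# Or merge two existing single-architecture builds into one fat binary:
lipo -create librocksdb.i386.dylib librocksdb.x86_64.dylib -output librocksdb.dylib
lipo -info librocksdb.dylib   # inspect which architectures the result contains
```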

It can be used like this:

  1. Developers building for .NET Core will just depend on RocksDbSharp.
  2. Developers building for .NET Framework or Mono will be forced to use 64 bit (except on the Mac); if they don't care about portable output, they will also depend on just RocksDbSharp.
  3. Developers building for .NET Framework or Mono who want portable output will also need to depend on RocksDbNative.

One possible problem is that when publishing for .NET Framework, RocksDbSharp will want to copy the native binary for the current build environment to the publish output root. If RocksDbNative is also present, it will try to put a file with the same name there. I don't know how the tools handle this; we'd have to ensure it isn't a problem.

Have I missed anything?


mkosieradzki commented on August 17, 2024

I think that's a quite complete analysis.

We can achieve most of the described scenarios using one or more nuget packages.

There is only one point missing that I didn't mention before: static linking against the vcredist. I am basically against this approach, despite loving the idea of zero-config deployment.

The reasons not to link statically against the vcredist:

  1. RocksDB is often not the only native library in an application. If all libraries were linked statically, it would mean a lot of duplication and wasted memory within a single application.
  2. The vcredists are serviceable, meaning they receive security updates.
  3. Static linking also prevents the memory image of the vcredist from being shared between different applications.

IMO it would be a good idea to stop static linking. If someone wants zero-config deployment, they can add a NuGet package with the vcredist DLLs (again, if we have switched to P/Invoke, it will work out of the box).

I know it's controversial, and I've seen that you have put some effort into static linking, but I think it's worth considering. Even a project as huge as .NET Core requires the VCRedist. (Maybe it's a good idea to link against the same VCRedist version that .NET Core already requires? :)

