
For example you want to depend on SDL.

You add the "derelict-sdl2" package to your DUB dependency list. DUB will download and build that "derelict-sdl2" library, a small library that loads function pointers from the SDL2 dynamic library at runtime (aka dynamic loading).
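For reference, the dependency entry in dub.json looks roughly like this (the package name is the one above; the version constraint is only illustrative):

    {
        "name": "myapp",
        "dependencies": {
            "derelict-sdl2": "~>3.0"
        }
    }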

You'll still have to distribute your cross-platform app with SDL2, but there is no linker option: the linker doesn't need to know about an import library. Voila, the same build works on all platforms.
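In code, loading then looks roughly like this (a sketch of the Derelict loader convention documented at the link below; exact module paths can vary between versions):

    import derelict.sdl2.sdl;

    void main()
    {
        // Locates the SDL2 dynamic library at runtime and resolves its
        // function pointers; throws if the library can't be found.
        DerelictSDL2.load();

        SDL_Init(SDL_INIT_VIDEO);
        // ... use SDL2 as usual ...
        SDL_Quit();
    }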

Conversely, in C and C++ you would probably only have static bindings as a choice, and that can be a pain for cross-platform builds. On Linux, the library has to be packaged; on Windows, you must find the right .lib.

There can be a considerable list of libraries in linker settings, essentially because there is no package manager that makes dependencies _composable_: dependency requirements leak into linker settings all across the chain. C++ forums are full of people failing to build; that's not the case with D.

Documentation: http://derelictorg.github.io/loading/loader/



How is this dynamic loading implemented? Is it effectively a cross-platform wrapper over LoadLibrary/GetProcAddress/etc. (and their POSIX equivalents), or is there more to it? Is it possible to make link time optimization work with dynamic loading?


> Is it possible to make link time optimization work with dynamic loading?

No, but nor can you with regular ol' linking against shared libraries. Or against static libraries (except for some mundane stuff like function reordering). LTO requires toolchain support and occurs before code generation.


> Is it effectively a cross-platform wrapper over LoadLibrary/GetProcAddress/etc.

Yes, it's just that.
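Roughly, such a loader boils down to something like the sketch below (simplified, not Derelict's actual code; names like openLib/loadSym are made up for illustration):

    // Simplified sketch of a Derelict-style runtime loader.
    module loader_sketch;

    import std.string : toStringz;

    version (Windows)
    {
        import core.sys.windows.windows;

        alias LibHandle = HMODULE;

        LibHandle openLib(string name) { return LoadLibraryA(name.toStringz); }
        void* loadSym(LibHandle lib, string sym) { return cast(void*) GetProcAddress(lib, sym.toStringz); }
    }
    else
    {
        import core.sys.posix.dlfcn;

        alias LibHandle = void*;

        LibHandle openLib(string name) { return dlopen(name.toStringz, RTLD_NOW); }
        void* loadSym(LibHandle lib, string sym) { return dlsym(lib, sym.toStringz); }
    }

    // The binding declares one function pointer per C function it exposes.
    extern (C) alias SDL_Init_t = int function(uint);
    __gshared SDL_Init_t SDL_Init;

    void loadSDL2()
    {
        version (Windows)   enum libName = "SDL2.dll";
        else version (OSX)  enum libName = "libSDL2.dylib";
        else                enum libName = "libSDL2.so";

        auto lib = openLib(libName);
        assert(lib !is null, "could not open " ~ libName);

        // Resolve each symbol by name and store it in the matching pointer.
        SDL_Init = cast(SDL_Init_t) loadSym(lib, "SDL_Init");
        assert(SDL_Init !is null, "could not resolve SDL_Init");
    }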

> Is it possible to make link time optimization work with dynamic loading?

Not that I know of (not sure).



