
Comments (94)

qti3e avatar qti3e commented on April 28, 2024 68

@hstarorg I'm against having a package manager.

I think other tools, including npm, are more network-dependent. Here is why:

  1. With npm you need a network connection just to initialize your project.
  2. You need a network to run commands such as npm install or yarn add.

Why I think importing from URLs is a good idea:

Deno uses a built-in cache, so you don't need to re-download a bunch of things (that you already downloaded for your other projects).

We must manage more js files, not our own development.

You already do that in npm!

And it's easier in Deno, since you only have to import one file per dependency wherever you need it : )
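The caching behavior described above can be seen from the CLI; a hedged sketch (the std URL and version here are illustrative, flags per the Deno manual):

```shell
# The first run fetches the remote module and stores it under $DENO_DIR
deno run https://deno.land/std@0.50.0/examples/welcome.ts

# Later runs (even from other projects) reuse the cached copy;
# pass --reload to force a fresh download
deno run --reload https://deno.land/std@0.50.0/examples/welcome.ts
```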

from deno.

oridark avatar oridark commented on April 28, 2024 46

I think a package-management tool like npm is needed, because we are lazy; we want things to be easy to use, and writing URLs is really not a pleasant experience for everyone. If you want everyone to use it, you should make them happy to use it.


mohsen1 avatar mohsen1 commented on April 28, 2024 37

A package name in package.json is just a shortcut to an actual URL anyway.


ry avatar ry commented on April 28, 2024 31

@sandangel The idea is to explore what a simpler runtime could be. It may indeed be too tedious to link to URLs (but Go seems to do fine with it). If the project looks like it's going well, I will write a source-rewriting tool for converting old programs.
For now, no package manager is needed or desired.


cojack avatar cojack commented on April 28, 2024 27

Guys, without any package manager we will end up with plenty of lodash source in a single application: instead of one 4 MB lodash, we'll blow the memory up to 40 MB, because 10 different packages use almost the same lodash version but each bundles its own copy, so every import pulls it in again.

This is ridiculous.


ry avatar ry commented on April 28, 2024 25

The idea with the URL imports is that deno acts as its own package manager. I explicitly want to avoid the need for ancillary tooling.


jedahan avatar jedahan commented on April 28, 2024 24

I love the idea of url imports because it makes it easier for us to experiment with ideas like ipfs://<contentHash>, dat://<similarWeirdAddress>, ssb://%<yepthisworkstoo>, and ssb-git://, etc.


cemremengu avatar cemremengu commented on April 28, 2024 22

Even if you don't have a package manager, there should be a central place to list dependencies.

Are we going to write the absolute url in each file? How does it work?


sandangel avatar sandangel commented on April 28, 2024 12

@TehShrike but having a cli tool just makes life easier ^^


dummyuser10 avatar dummyuser10 commented on April 28, 2024 12

The main issue I see with the URL-based imports is that it's not possible to upgrade the versions of a given package project-wide, at least not without doing a find-and-replace or via some utility.

Co-locate your dependencies in a single file and re-export them. (e.g. https://github.com/oakserver/oak/blob/master/deps.ts/)

One of my favorite things about package.json was that it gave you a way to easily view all dependencies used in a project/module when browsing/auditing code. Something like a deps.ts file isn't really as good as package.json IMO because it is left to best intentions. There is nothing to stop people from not using a deps.ts file or to have a URL import somewhere within their code even if they do have a deps.ts file. So looking at a deps.ts file isn't really a guarantee that there aren't other deps used elsewhere in the project.

Just my 2 cents, but I personally wish Deno had created its own standard, blessed way of managing deps rather than a best-intentions recommendation to put all your deps in a single file.


hmmdeif avatar hmmdeif commented on April 28, 2024 11

My main concern is that URL-only imports do not guarantee safety for the user. At least with npm we can see whether any of our dependencies have been tampered with (sort of, but this way you get even less safety).

I foresee that if you keep URL-only imports, you'll need some sort of centralised system holding checksums of the packages, with approved statuses, so that we know packages held in whatever location have not been tampered with.

Perhaps this isn't a valid concern for now though?


 avatar commented on April 28, 2024 11

npm and its package.json + package-lock.json allow a quick security audit of installed packages, including sub-dependencies. With URL imports of pre-bundled packages, it's unclear to me how I would do that.

Also, when a dependency version upgrade is warranted, do I search my whole codebase to update every import...?

As for the built-in cache argument: if you can cache a URL, surely you can cache the result of an npm install? pnpm does exactly that, and even symlinks packages to save disk space and installation time.


Soremwar avatar Soremwar commented on April 28, 2024 11

@zuohuadong There is no point in developing a feature that goes against the philosophy of the language it uses. That is the real issue that ultimately makes Node a bad choice for a runtime.

And let's not get personal; the community also had a say in this.

EDIT: Upvoting your own post would work if we couldn't see who upvoted, boy 😄


mlogan avatar mlogan commented on April 28, 2024 11

The main issue I see with the URL-based imports is that it's not possible to upgrade the versions of a given package project-wide, at least not without doing a find-and-replace or via some utility.

Co-locate your dependencies in a single file and re-export them. (e.g. https://github.com/oakserver/oak/blob/master/deps.ts/)

This misunderstands the issue.

In Node, if I depend on A, and A depends on B, and B releases a critical security patch, I can apply that patch immediately by running a single command. I don't have to wait for A to release a new version. And if, as is often the case, I depend on packages A through Z, all of which depend on B, I don't have to wait for dozens of different package maintainers to release new versions before I can be rid of the vulnerable version of B.
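The single-command workflow described here can be sketched as follows (assuming npm ≥6; the package name B is a placeholder):

```shell
# list known-vulnerable packages anywhere in the dependency tree
npm audit

# pull in the patched release of the nested dependency B
# without waiting for A (or anything else) to publish a new version
npm update B

# or apply all compatible security patches at once
npm audit fix
```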

As I understand it, in deno, whether you plumb imports through deps.ts or not, an import statement always refers to an exact package version. Therefore the only way to receive updates to indirect dependencies is to first wait for updates to your direct dependencies to be released.

Other languages do things this way, and for all I know it's better that way. But, the existing module culture of node ("make lots of packages, depend on loooooots of packages") will not be able to exist in deno due to this issue.


wujohns avatar wujohns commented on April 28, 2024 10

When programming in Go, the biggest problem I encountered was package management.

I am afraid to use a language without an official package-management tool. Go, for example, has many third-party tools for managing packages precisely because an official one was missing for so long (dep improved the situation).

So I think the result of "import from URL" will be that many new tools are created to connect npm and Deno. Maybe this result is good for most people.


zuohuadong avatar zuohuadong commented on April 28, 2024 10

However, Go also has a package manager. Rust also has a package manager.
IMHO, ry is not suited to making these decisions, as with node-gyp and node-callback. He always goes his own way.


wujohns avatar wujohns commented on April 28, 2024 9

Version control for packages is important. Something like go-dep.


Soremwar avatar Soremwar commented on April 28, 2024 9

@ahmetcetin Deno has a caching system to prevent this kind of problem, but what you are asking about is actually a problem all package managers have. There is no way to guarantee that your whole dependency tree will be available at all times.

This is partially solved through smart development, though. Sadly, both Node and early-stage JavaScript (the ES5 days) promoted a style of development that relied on small libraries to cover small necessities, making your dependency tree a whole mess in no time.

JavaScript is more mature now, so we don't need packages for basic needs such as string casting or taming callback hell. I see that most libraries currently developed in Deno only use the deno/std libraries, and that makes me a little proud 😄

Just remember that Deno doesn't add any kind of dependency-management features; it just incorporates the native JS ones so they play nicely on your computer.


FINDarkside avatar FINDarkside commented on April 28, 2024 8

Using version ranges in your URLs means that your builds aren't reproducible.


TehShrike avatar TehShrike commented on April 28, 2024 7

@wujohns Bear in mind that with the third-party service unpkg, you can use URLs to depend on npm packages with the same version ranges that you use in your package.json file, ala https://unpkg.com/deepmerge@~1.4.2 -> https://unpkg.com/[email protected]/dist/cjs.js


balupton avatar balupton commented on April 28, 2024 7

I think a package-management tool like npm is needed, because we are lazy; we want things to be easy to use, and writing URLs is really not a pleasant experience for everyone. If you want everyone to use it, you should make them happy to use it.

The problem of finding and typing URLs can be solved via VS Code plugins, or a Siri plugin; there's no reason that complexity should live in something whose goal is to make things lighter.


Soremwar avatar Soremwar commented on April 28, 2024 7

@zuohuadong I never talked about Go, but if you want to talk about big projects, just open your browser dev tools. Notice the big list of dependencies being imported every time you browse the internet?

This design pattern works for every single webpage you visit, and half of the time the modules they import come from a CDN or elsewhere. And yet I bet you can't count on your fingers how many times a webpage broke because it couldn't find a script (or at least you weren't able to tell the difference).

Deno exists wherever JavaScript exists. And JavaScript is currently used for web applications, servers, desktop apps, mobile apps, serverless apps of any kind, artificial intelligence, hardware programming, games both web and not, command-line applications, and so on.

Notice something about all the things I mentioned? Most of them benefit from the model used by JavaScript, since they don't depend on a package manager to work, so the language can take on the rest of the job.

But if you think you'll be better off using other tools, then do it. No one is forcing you to use Deno.


randfur avatar randfur commented on April 28, 2024 7

People saying deno needs a package manager like npm... what if you just use npm as the package manager?
import {dog} from '../node_modules/dog/index.ts';


ajimae avatar ajimae commented on April 28, 2024 6

Even if you don't have a package manager, there should be a central place to list dependencies.

Are we going to write the absolute url in each file? How does it work?

A workaround for a central package manager can easily be implemented locally, for example:

// imports.ts
export { serve } from "https://deno.land/[email protected]/http/server.ts";


// server.ts
import { serve } from "./imports.ts";

So the imports.ts file can hold all your imported dependencies.

However, I am thinking of an easier, more concrete automation: instead of you listing your imported dependencies manually, it would generate a file like the example above for you.


hstarorg avatar hstarorg commented on April 28, 2024 5

As with Go, the problem of missing unified package management has been highlighted.
I still think importing packages by url is not good; maybe you should consider the following questions:

  1. The code can't run in other users' environments.
  2. Controlling the dependency version is very difficult.
  3. Sometimes a project dependency behind a url will be deleted without notice.


Macil avatar Macil commented on April 28, 2024 5

Node+npm has a benefit over Deno right now: npm stores the checksums of all packages in package-lock.json, so if you clone a repo that contains package-lock.json (and .gitignores node_modules as usual) and then run npm install, then you know you got the same exact packages that the author intended. Deno doesn't have an equivalent way of locally checking checksums of remote imports afaik.
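(Deno did later grow an equivalent: an integrity-checking lock file. A hedged sketch, assuming 1.x-era CLI flags and an illustrative deps.ts entry point:)

```shell
# record the checksum of every remote import in lock.json
deno cache --lock=lock.json --lock-write deps.ts

# on a fresh clone, re-fetch and fail if any checksum differs
deno cache --reload --lock=lock.json deps.ts
```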


firecrackerz avatar firecrackerz commented on April 28, 2024 5

The package manager is necessary; check out https://github.com/apple/swift-package-manager or https://github.com/rust-lang/cargo.
We can always use a PM + URL (if possible).


tomitheninja avatar tomitheninja commented on April 28, 2024 5

How about creating a brand-new package manager?
Just to be clear: yes, I am just an IT student who has probably watched too many YouTube videos.

As I read in the comments, I think these are the most important factors to consider:
Both centralized and decentralized solutions have a chance of becoming unavailable over time.
Furthermore, user-hosted solutions might have silent changes, although this could be fixed by saving hashes locally, right?

I am not saying that URL-based package fetching is completely wrong, but I don't think it is a good solution for the community.
In my opinion, it should be used in enterprises. But for that, it should be able to load packages from ssh, ftp, git, ...

But for a normal user, it takes too much effort to find each package on the internet.

So my idea is to create the pied piper package manager. 🐀
It would work like this:
All packages must be named @namespace/package/version

For example @std/http/1.0.0

  1. A package author can reserve a namespace and download its private key 🗝 from the official package-manager website. After that, no one should ever be able to download the key for that namespace again. Basically, the server just distributes the namespaces.

  2. With the private key the author can sign the package and then publish it to their own namespace. They should not be able to upload a new file for the same package version.

  3. Others around the world would immediately download the package and start seeding it.

  4. The end user could just import it.

To get people to seed the packages, it could reward them with a blockchain currency (denos).

And seeding smaller packages should be preferred, so the package manager does not become a replacement for traditional movie pirating.

This method would solve most (or all) of the problems I read above.
It would be decentralized, reliable and secure.


zuohuadong avatar zuohuadong commented on April 28, 2024 5

@Soremwar

Golang has proved that this is wrong; it is not suitable for large projects.
So, is deno's goal serverless?
If it's only for such small projects, why wouldn't developers choose the simpler Go or Python?
Where does deno exist? The front end?


ahmetcetin avatar ahmetcetin commented on April 28, 2024 5

@Soremwar So, as I understand it, it's possible to use depA which depends on depB, right? In that case there isn't really much difference between deno and node, except that deno is in its early stage and everyone plays nicely; it was the same for node in its early days. When too many people develop too many packages, it becomes problematic, but that's the responsibility of any sane developer: check what you use, and use packages responsibly.

As long as the platform supports dependency chains, you simply cannot prevent a situation like the left-pad incident. Indeed, JavaScript adds new features every year, so there is no need for a package like left-pad, or promise packages like bluebird, but there will always be packages that are necessary today and won't be tomorrow, and there will be lots of packages dependent on those "future unnecessary" packages. Whether adding new features to a language so frequently is a good or a bad thing is another deep discussion; most highly reliable languages don't change much, but let's leave that aside for now.

About caching: it's nice that deno handles caching of imports intelligently (I assume it writes somewhere in the file system, or uses v8's cache, if supported, which in turn writes to the file system), and so does node (npm/yarn) via node_modules. In the left-pad incident, nobody who had already installed left-pad was affected (we weren't; luckily we didn't push any code to production that day). The problem was installing dependencies on a fresh system, like CI/CD or deployments. Suppose the maintainer of a package simply deletes it from the URL you're importing from, and you want to deploy your code to production on a new EC2 instance in AWS: good luck. Of course you can clone the dependency repo and use that, but then you are effectively creating your own registry of dependencies, and you have to maintain the updates, bug fixes, security fixes, etc. from the cloned repo, which doesn't sound like a good option either. On top of that, you may end up with copyright issues, changes in the licenses of the packages you use, and so on.

Having a managed registry like npm doesn't totally solve these issues, but it helps a lot. A lesson learned from the left-pad incident: to publish a package on npm now, you accept that you won't be able to remove it if it is a dependency of other packages, which is legally binding, so npm can keep serving your package. As for licenses, there are many tools around that check your dependencies and list the licenses used, so you can decide how to behave properly.

Please don't get me wrong, I'm not saying npm is the solution; I'm just trying to lay out the pros and cons of the approach deno took. If I've misunderstood how deno works, please correct me.


Soremwar avatar Soremwar commented on April 28, 2024 4

I don't consider having too many packages a bad thing XD
In fact, as time goes by, Deno should technically have way more packages than Node, since they are not restricted to a namespace.

The point of this discussion was (since it's closed but somehow ends up lit every week or so XD) to talk about whether Deno requires a package manager and the advantages one might bring.

Deno does work differently from Node under the hood, but all the basics (modules and such) are there in both cases for you to do whatever you want with them.

Thanks for your elaboration on this topic 👋

I hope more people will read this and understand the reasoning before jumping the gun on blaming Deno for going full-on Node.


mischkl avatar mischkl commented on April 28, 2024 4

The main issue I see with the URL-based imports is that it's not possible to upgrade the versions of a given package project-wide, at least not without doing a find-and-replace or via some utility. I guess import maps will help here but as I understand it they are still in development.

Regarding stable versions until explicitly refreshed: is there some kind of package-lock.json equivalent, or a package-relative place where the cached files are stored? Because if the versions are only "fixed" on the developer's own machine, that's pretty useless.

Additionally, regarding built-in bundling: how flexible is it? Does it always require every imported local file to be bundled together, or can multiple entry-point files be defined? This matters because for large projects a dependency tree with hundreds of files is obviously not fit to "publish", but one single bundle is likely too big a thing as well; ideally, a list of entry-point files could be specified to split the code along the lines that make the most sense for consumption.


matiaslopezd avatar matiaslopezd commented on April 28, 2024 4

I still think this way of importing packages will be really insecure. Security comes first! I mean, over 80% of devs today copy and paste the classic npm i package-name and never check the source code. That could end in malware, viruses, trojans, etc. in your dev environment. Imagine all the services that provide analytics and builds of projects executing and reading malicious code!

Another way to understand this is with the well-known express package. For example, say I want to import it in Deno:

import { express } from 'https://malicious-cdn.com/libs/express/express.ts'

Everything is OK; it runs perfectly and serves the latest express package... but the package contains malicious code that detects the dev or production environment and can log every HTTP request, undetected by developers.

This is 100% possible with npm too, but we know that npm can remove such a package; in Deno, no one can. I can easily see the difference between these two, but how can I trust that CDN?

  1. npm i express-2
  2. https://malicious-cdn.com/libs/express/4.17.1/express.ts

I know that Deno has a different philosophy of security over the runtime, but obviously if a package needs write and read permissions, developers will grant them.

What plans are there for this? Is it the cost of decentralization?

Packages over a "Deno proxy"

Could there be an optional global package registry that doesn't store the files? Maybe a proxy for Deno packages: with that flow, Deno could give some security to developers and block malicious packages.

For example:

import { express } from 'https://malicious-cdn.com/libs/express/4.17.1/express.ts';

I'd still be using the malicious package, but when the app starts, the Deno proxy checks whether the URL is on the package blacklist.

I say optional because it could be a good idea to turn off the "Deno proxy" when the developer doesn't want to use it, e.g. with private repositories. It could also integrate with a list of trusted domains (trusted-domains.json) that would not pass through the Deno proxy.


qti3e avatar qti3e commented on April 28, 2024 3

@sandangel well, I'll dive into this issue next week and see whether I can write a Node-to-Deno converter : )

(FYI: I'll probably parse the AST and insert file extensions into module names; I'm also thinking of converting npm imports (package names) to unpkg URLs.)

If you have any other idea please let me know : )


cojack avatar cojack commented on April 28, 2024 3

Do you know why every programming language has a linter?
You are not able to overcome this without a package manager.
Seriously, JavaScript was the first programming language without a linter, and here we go again.
Microsoft did great work, creating a solution with a linter and a transpiler at once; and here we are.


kitsonk avatar kitsonk commented on April 28, 2024 3

Not yet, but there is an issue for that; I can't remember which one. Of course, all a checksum says is that some file you retrieved was transmitted successfully. Likely what we need in Deno is to give the end user the ability to know who is providing the module and decide whether they trust them.

We need to explore security models for remote modules more, but a centralised package manager is not it. A set of centralised signing authorities, maybe.


piq9117 avatar piq9117 commented on April 28, 2024 3

Have yall considered nixpkgs to manage dependencies? (https://github.com/nixos/nixpkgs)


ahmetcetin avatar ahmetcetin commented on April 28, 2024 3

I don't know if any of you are not working on a project that was supposed to be completed yesterday. Let's face it: we live in a time when we need to prepare something working quickly as a proof of concept / MVP, get approval, then continue developing the product. I have worked on many projects, in many different kinds of companies, and this pattern never changes.

@ry I understand the feelings against npm, and I totally agree with them, but when the time comes to prepare the MVP, you have to start quickly and decide what you will use. Unfortunately, those decisions significantly dictate what will be used in the project later on, like node vs deno, express vs hapi, etc. You cover the happy path in the MVP, then worry about problems later.

What I generally do is find as much existing code as I can for the MVP, using Google and npm search, then later on keep the reliable pieces like express and lodash as dependencies and fold the other code into the project. Is it a perfect approach? Obviously not, but it works OK.

The import-from-URL approach is really a good idea, and it should stay as it is, no matter what. On the other hand, an enormous number of packages are registered on npm; you can find a few libs, if not hundreds, for almost any use case.

I understand the concerns about keeping the code clean, and I support them with my whole heart. But if deno were npm-compatible, I believe it would attract almost the whole node community. And I believe node wouldn't be as popular as it is today without npm and community members like TJ in the early days.

I read a comment about using bower; well, bower requires npm to install bower nowadays, so why bother. I think being able to use npm packages should at least be optional, and users should decide whether to stick with URL imports or npm imports. Just my 2c.


zuohuadong avatar zuohuadong commented on April 28, 2024 3

@Soremwar
I might have done that a few years ago, but now I use webpack.
It is 2020 now, though.
I use React & Angular. I love "ng update".
On the back end, I think deno is no different from a toy.
I recommend Kotlin & Go.
Deno does not replace anything; it just recreates ASP + TypeScript.

Thanks again for your serious reply.


juanpenarandaosf avatar juanpenarandaosf commented on April 28, 2024 3

Another comment here from someone who sees benefit in the package.json listing all the different packages that a project has.

One thing I wonder is: what happens if we need to upgrade a package to a new version? In node we could update package.json and keep the require('express'), for example. But in deno, as I understand it, we would have to go through every file and update it there. There should be a better way to address this, because we could have a case where some file is left unchanged and an old API is used unexpectedly, or a security issue remains.

I like the solution from @kitsonk using deps.ts, but with it we as developers have to re-export the functions/classes from each of our dependencies. In terms of maintenance, I think deno could have something like package.json to do this mapping, for example:

{
  "dependencies": {
    "react": "https://the-cdn-url-here@version"
  }
}

and then in the file just use import { something } from 'react' as normal. Also, I think this could improve readability and help more people transition to deno.

Keep up the good work guys! :)


hstarorg avatar hstarorg commented on April 28, 2024 2

@jedahan With that style, your code would be very hard to share.


cojack avatar cojack commented on April 28, 2024 2

@balupton you're right, but importing a BUNDLED package from e.g. unpkg, as mentioned above: unpkg was not created for that, and it won't return a tree-shaken package, but one BUNDLED with all of its dependencies.


kitsonk avatar kitsonk commented on April 28, 2024 2

Srsly JavaScript was first programming language without linter

Not sure what your point is, but this statement is patently false. There were lots of 3GL languages that don't/didn't have a linter. Let's not bring up 2GL or 1GL languages.


hstarorg avatar hstarorg commented on April 28, 2024 2

@balupton For 1, it's also about dependency version control. A version in the url is not mandatory; that's a risk.


crowlKats avatar crowlKats commented on April 28, 2024 2

@juanpenarandaosf you are basically asking for import maps, which are already a thing


TylerBre avatar TylerBre commented on April 28, 2024 2

It's been mentioned a few times, but it seems some of you failed to read the documentation. Deno has support for "import maps", which serve to alias import URLs to human-readable strings or "package names".

https://deno.land/manual/linking_to_external_code/import_maps

For the lazy: this means you create a file called "import_map.json" and populate it with your project's dependencies. Shockingly (coincidence?) similar to how package.json works...
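A minimal sketch of such a file (the module URLs and versions are illustrative):

```json
{
  "imports": {
    "http/": "https://deno.land/std@0.100.0/http/",
    "react": "https://cdn.example.com/react@16.13.1/mod.ts"
  }
}
```

Run with deno run --import-map=import_map.json server.ts, and an import like import { serve } from "http/server.ts" resolves through the map (early Deno versions required --unstable for this flag).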

By definition, this is package management. Just because there is no magical CLI to update and manage this file for you, does not mean there is no way to manage and centralize your project's dependencies.

Further, if any of you wish to create a tool to do this, you are more than welcome.

Edit: for the people comparing a JS runtime to Go and Kotlin... Really?


sandangel avatar sandangel commented on April 28, 2024 1

@qti3e I'm curious: what is the strategy for reusing the thousands of JS and TS libraries out there, given they use a normal require() (JS) or import (TS)?


kitsonk avatar kitsonk commented on April 28, 2024 1

Are we going to write the absolute url in each file? How does it work?

Ryan's talk makes it explicit that a package server would handle semantic versioning as part of the URL.

after the version of npm3, the “black hole” is flattened

It just looks different, but in practice, it isn't very "flat".


jedahan avatar jedahan commented on April 28, 2024 1

@hstarorg hard to share in that not everyone will have a system-level resolver, I agree. But there are so many good solutions for this, and what I like is that it seems deno is making decisions that don't explicitly stop us from trying any of them:

  • have go/rust/c++ resolve the protocol itself directly (built in fetch that makes it easier to share without installing anything)
  • passthrough to system-level resolver (hard to share if you don't have something to handle those protocols, so I agree with you)
  • something else in the future we haven't thought of?


balupton avatar balupton commented on April 28, 2024 1

@cojack that is what tree shaking is for


hayd avatar hayd commented on April 28, 2024 1

@keyboard-clacker you can do this with a deps.ts file which imports/exports the relevant urls (for example). This means you only need to update the deps.ts file if it's a non-breaking change.

An alternative is to use importmap.


Soremwar avatar Soremwar commented on April 28, 2024 1

Deno won't support any functionality that requires changing current JavaScript behavior. Decisions like that continue to cause problems in Node development.

If you want functionality like that, you can check import maps, as has been discussed before.


kitsonk avatar kitsonk commented on April 28, 2024 1

People can already tap into npm packages for Deno without needing a package manager. They just use one of the various CDNs (or create their own). Pika.dev/cdn and jspm.io both bundle packages into single-file distributables. Unpkg.com makes any npm package available, and if a package includes ES modules or a single-file bundle, Deno can usually consume it. This is not a problem a package manager needs to solve for Deno.


nsisodiya avatar nsisodiya commented on April 28, 2024 1

importmaps


kitsonk avatar kitsonk commented on April 28, 2024 1

The main issue I see with the URL-based imports is that it's not possible to upgrade the versions of a given package project-wide, at least not without doing a find-and-replace or via some utility.

Co-locate your dependencies in a single file and re-export them. (e.g. https://github.com/oakserver/oak/blob/master/deps.ts/)


cknight avatar cknight commented on April 28, 2024 1

@dummyuser10 check out deno info mymodule.ts, for example, which will print all dependencies of your module, regardless of where they are kept.

from deno.

sandangel avatar sandangel commented on April 28, 2024

@qti3e will code autocompletion and other IDE-related stuff work with that?

from deno.

qti3e avatar qti3e commented on April 28, 2024

@sandangel not yet, but it's not a hard task, there are only two differences between a Deno file and a regular ts file:

  1. remote URL import
  2. top-level await

from deno.

sandangel avatar sandangel commented on April 28, 2024

@ry and the tool you have just described works just like a package manager ^^. It downloads the package, converts it so it can be imported with a URL, and adds the import to the source code. Am I wrong? :)

from deno.

hstarorg avatar hstarorg commented on April 28, 2024

How are multi-level dependencies processed? For example: A depends on B, and B depends on C?

from deno.

yorkie avatar yorkie commented on April 28, 2024

Deno is browser-compatible, so any package source (Bower, unpkg, npm) is available, too.

from deno.

wujohns avatar wujohns commented on April 28, 2024

since npm 3, the “black hole” is flattened

from deno.

hstarorg avatar hstarorg commented on April 28, 2024

@wujohns Too many options make it difficult to choose. Splitting package management across tools scatters packages among many tools.

from deno.

wujohns avatar wujohns commented on April 28, 2024

@hstarorg So an official tool or central repository is important. Fortunately we have npm; any new tool's core will be it (there is actually only one choice).

from deno.

hstarorg avatar hstarorg commented on April 28, 2024

@wujohns An npm-like tool is very important; it also needs to fix some of npm's defects.

from deno.

balupton avatar balupton commented on April 28, 2024

@cojack importing bundles seems to be a non-optimal solution, as it reduces caching of modules. However, good version-range caching, like what unpkg does, reduces the download burden here; combine unpkg-style caching with a tree-shaken bundle for your app, and I don't see what the problem is.

@hstarorg

Seems an ipfs proxy cache for unpkg would solve 2 and 3.

For 1, can you elaborate on that.

from deno.

hayd avatar hayd commented on April 28, 2024

For 2., the solution is to have the version in the URL.
3. is always a risk, on npm/unpkg too: things can be removed or the site can go offline.
You can mitigate this by pre-fetching, e.g. vendoring .deps and committing it to the repo (or a Docker image). That way, at least, this wouldn't be an issue at either deploy or run time on a production system.
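A rough sketch of that pre-fetch/vendoring workflow, assuming the `deno cache` subcommand, the `DENO_DIR` environment variable, and the `--cached-only` flag (all present in Deno 1.x; the directory name here is arbitrary):

```shell
# Fetch and compile every remote dependency into a project-local cache
DENO_DIR=./.deno_cache deno cache deps.ts

# Commit ./.deno_cache (or bake it into a Docker image). At deploy time,
# point at the same cache and forbid network fetches entirely:
DENO_DIR=./.deno_cache deno run --cached-only main.ts
```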

from deno.

hayd avatar hayd commented on April 28, 2024

At least with npm we can see if any of our dependencies have been tampered with.

I don't think this is any stronger than any other registry would be.

There's definitely a trust issue with using random webpages on the internet to host your code, registries like deno.land/x/ seem natural. The difference with deno is you don't need to bake these registries into the runtime - you can have registry competition (including on security e.g. checksums).

from deno.

hstarorg avatar hstarorg commented on April 28, 2024

package name in package.json is a shortcut to an actual URL anyways.

@mohsen1
It can also ensure package availability and control the package version.

from deno.

Tundon avatar Tundon commented on April 28, 2024

So I know there is already enough discussion about whether a package management tool is needed, but I think there may have been one spot missing from these discussions.

That is, what about offline deployment? A common scenario (especially for commercial software) is to package all necessary dependencies and the Deno runtime into a single package, and let the end user install it.
There are some concerns in these scenarios:

  1. the target machine is out of the developers' hands and may not have internet access. This is pretty common for internal business tools
  2. commercial software has IP concerns; if the deployed software needs to fetch some remote source, that could be a deal breaker

What I currently have is something like pkg, which lets me package all the source and resources into a single file (even if it is not as perfect as an AOT-compiled binary, it does help with some of these concerns).

Just my fifty cents.

from deno.

hayd avatar hayd commented on April 28, 2024

@Tundon in deno you can commit/bundle your DENO_DIR at build time, therefore you won't need a remote source at deployment (on the target machine).

from deno.

avegancafe avatar avegancafe commented on April 28, 2024

@hayd oh amazing, I didn’t know about importmap, that’s exactly what I want! Ahah. Thanks for the info! I’ll try out and use that :)

from deno.

Speykious avatar Speykious commented on April 28, 2024

I would say, if the only issue is that urls are harder to read than npm package names, then we can just simulate a package manager using something like an imports.ts file. And it would look something like this:

// imports for dependencies
export * as pkg1 from "<some url to pkg1>"
export * as pkg2 from "<some url to pkg2>"
export { obj1, class2, mod3 } from "<some url to pkg3>"

from deno.

tomitheninja avatar tomitheninja commented on April 28, 2024
import { lodash } from '../../../imports.ts'

from deno.

nayeemrmn avatar nayeemrmn commented on April 28, 2024

@ahmetcetin We can support existing npm packages, but they won't run because Deno has different bindings 🤷‍♂️

Maybe you're saying it should have the same bindings as Node as well? Use Node.

from deno.

ahmetcetin avatar ahmetcetin commented on April 28, 2024

@nayeemrmn that's what I do; I use Node mostly because I know I can find whatever I look for, but I really would love to use deno in my projects. I know that deno has different bindings; I believe it would be really helpful if it also supported node bindings, but that's another discussion entirely.
My point is: if deno supported imports the same way, like
import { differenceBy } from 'lodash'
then maintainers of packages would support deno as well, and those of us using those packages would just use them the same way. All you would need to do is search npm and check whether the package supports deno. I'm fully aware of the problems with npm, but the importance of having a registry is enormous imho.

Correct me if I'm wrong: if deno were able to tap into the npm registry, at least all packages written in pure JavaScript would be available to use in deno. And for the others, I believe it's much easier and more desirable for a package maintainer to deal with one registry, even if they need to deal with two separate sets of bindings.

from deno.

ahmetcetin avatar ahmetcetin commented on April 28, 2024

I respect the design choices of deno, but the real issue is not whether there is a package manager; a package manager is just a tool that helps organize packages. The issue is how importing dependencies works. Just to understand how dependencies really work in deno: let's say I have a depA, which depends on depB, which (to keep it simple) has no dependencies of its own. The maintainer of depA imported depB like so:

import { modB } from 'https://uri-of-depB'

In this case, if I import only depA:

import { modA } from 'https://uri-of-depA'

would it work properly in deno? If it works, imagine that lots of packages started to support deno this way, and you're using depA which requires depB which requires depC, etc. Then what is the difference between referencing the dependency by URL or by a folder under node_modules? In both cases, if any piece of code breaks in the dependency chain, your code will fail anyway.

If it won't work in deno, don't you think that's a problem? I mean, of course the best piece of software is the one you wrote and have control over, but that's kind of against reusability.

from deno.

Soremwar avatar Soremwar commented on April 28, 2024

@randfur You could if you wanted to, as long as the module uses standard JavaScript modules and not CommonJS for its resolution system (that's the beauty of Deno).

A package manager can't be implemented because it would limit the resources from which a JS module can be imported (it would require a metadata file, a standard to validate versions, a minimum runtime version, etc.).

According to ECMA standards, any valid JS file is a module of sorts, and Deno's team seems to be of this same mind

from deno.

matiaslopezd avatar matiaslopezd commented on April 28, 2024

Maybe my last comment could turn into a business model 😆

from deno.

matiaslopezd avatar matiaslopezd commented on April 28, 2024

Also, what happens to the most-used packages when they don't have enough backers to host the files and support all the traffic under the URL model? I mean, think of lodash, tslib, axios, express, moment, etc. with current developer demand.

I see a natural movement of companies offering freemium/paid hosting to open source packages, which could end in a centralized host.

I'm not saying using URLs is a bad idea, but developers need a secure model to import packages, with accessibility and security as fundamentals.

from deno.

crisnao2 avatar crisnao2 commented on April 28, 2024

I still think this way of importing packages will be really insecure. Security first! I mean, over 80% of devs today copy and paste the classic npm i package-name and never check the source code. That could end in malware, viruses, trojans, etc. in your dev environment. Imagine all the services that provide analytics and builds of projects executing and reading malicious code!

Another way to understand this is with the well-known express package. For example, if I want to import it in Deno:

import { express } from 'https://malicious-cdn.com/libs/express/express.ts'

Everything is fine, it runs perfectly and offers the latest express package... But the package has malicious code that recognizes the dev or production environment and can log all HTTP requests, undetected by developers.

This is 100% possible with NPM too, but we know that npm can remove it; in Deno, no one can. I can see the difference between the two easily, but how can I trust that CDN?

  1. npm i express-2
  2. https://malicious-cdn.com/libs/express/4.17.1/express.ts

I know that Deno has a different philosophy of security at runtime, but obviously if a package needs write and read permissions, developers will grant them.

What are the plans about this? Is this the cost of decentralization?

Packages over "Deno Proxy"

Could there be an optional global package registry that doesn't store the files? Maybe like a proxy for deno packages, because with that flow Deno could give some security to developers and block malicious packages.

For example:

import { express } from 'https://malicious-cdn.com/libs/express/4.17.1/express.ts';

I'm still using the malicious package, but when the app starts, the Deno proxy checks whether the URL is in the package blacklist.

I say optional because it could be a good idea to turn off the "Deno proxy" if the developer doesn't want to use it, e.g. with private repositories. It could also integrate with a list of trusted domains that don't pass through the Deno proxy (trusted-domains.json).

The idea of a proxy seems interesting; let me take it a bit further. Maybe it could even be maintained by NPM itself. Something like: I want such-and-such a package, so I would write something like import {module} from "npm.org?package=namespace/module.ts&package_version=1.&platform=deno&npm_package_version=". There, NPM would fetch the package and all its sub-dependencies, bundle them, and deliver everything as a single module file. From the moment it assembled the package the first time, it would be cached for everyone else. To keep things updated, they could generate a new package every time there is an update to the main module or its dependencies, and create a versioning system for the packaged modules, like an npm_package_version.

from deno.

juanpenarandaosf avatar juanpenarandaosf commented on April 28, 2024

Beautiful! thank you

from deno.

balupton avatar balupton commented on April 28, 2024

I've created make-deno-edition to make npm packages written in TypeScript compatible with deno. It is working on badges; a usage proof is here: https://repl.it/@balupton/badges-deno. It has now been used to make 32 node packages compatible with deno. You can use project to automatically generate the readme instructions for the deno edition, and boundation to automatically scaffold your projects and automate the entire process. start-of-week is an example where different entries are used for node, deno, and web browsers.

from deno.

theoludwig avatar theoludwig commented on April 28, 2024

In the official website manual, they use the deps.ts file solution. So what is the best solution: import maps or a deps file?

There shouldn't be two ways of doing the same thing, as deno itself promotes with, for example, the built-in utilities like fmt, lint, or test: everyone in deno uses the same linter and formatter, with no need to struggle with ESLint or Prettier configuration since everyone follows the same conventions.

I don't know if it's related to the issue (excuse me if it's not): in Node.js, package.json has a "scripts" object to easily execute long, repetitive commands. Deno doesn't have this, and it matters all the more because of the security flags: every time you want to execute your deno script, you must remember all the flags and options?

Note: since I'm mostly a Node.js developer, I'm new to the deno world, but I really like some of the concepts promoted by deno and am looking forward to the improvements. Excuse me if I'm saying something wrong.

from deno.

crowlKats avatar crowlKats commented on April 28, 2024

Import maps are a web standard, so deno implemented them. deps.ts is just a convention people like to use. The reason people use deps.ts is that import maps can't be used by modules, only by the final product.
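To make the mechanics concrete, here is a toy sketch of how an import map turns a bare specifier into a URL. The `resolve` helper and the URLs are hypothetical; the real web standard also handles `scopes`, URL normalization, and longest-prefix matching.

```typescript
// Toy resolver for the "imports" section of an import map.
type ImportMap = { imports: Record<string, string> };

function resolve(specifier: string, map: ImportMap): string {
  // 1. Exact match: a bare specifier mapped directly to a URL.
  const exact = map.imports[specifier];
  if (exact !== undefined) return exact;
  // 2. Prefix match: keys ending in "/" remap whole URL prefixes.
  for (const [key, target] of Object.entries(map.imports)) {
    if (key.endsWith("/") && specifier.startsWith(key)) {
      return target + specifier.slice(key.length);
    }
  }
  // 3. Fall through: already a URL or relative path; leave unchanged.
  return specifier;
}

const map: ImportMap = {
  imports: {
    "lodash": "https://cdn.example.com/lodash/4.17.15/lodash.ts",
    "std/": "https://deno.land/std@0.100.0/",
  },
};

console.log(resolve("lodash", map)); // the mapped CDN URL
console.log(resolve("std/path/mod.ts", map)); // prefix-rewritten std URL
console.log(resolve("./local.ts", map)); // unchanged
```

The point of the sketch: the mapping lives in data, outside any module, which is exactly why a module (as opposed to the final application) cannot bring its own import map.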

from deno.

chickencoding123 avatar chickencoding123 commented on April 28, 2024

OK, now I'm confused: why import maps and not standard TypeScript paths mapping? Especially since paths can optionally be placed in their own file using configuration inheritance? I must be missing something here 🤔

from deno.

lucacasonato avatar lucacasonato commented on April 28, 2024

Because import maps are a browser standard that also works for JS (not only TypeScript)

from deno.

chickencoding123 avatar chickencoding123 commented on April 28, 2024

@lucacasonato how about making support for paths an additional feature when using TypeScript?

from deno.

Soremwar avatar Soremwar commented on April 28, 2024

@IRoninCoder TypeScript paths aren't meant to be used with URLs (which would require extra tinkering on the TS side to make it work like that), and it would complicate the module resolution system Deno uses a lot more, rather than just appending import maps to it.

Also, see #7732 which would effectively remove the capability to do that.

from deno.

kitsonk avatar kitsonk commented on April 28, 2024

paths only provides the TypeScript compiler with information about where code exists at runtime. Since Deno manages both the compilation of TypeScript and where code is "located" at runtime, paths is a nonsensical config option and is ignored.

from deno.

balupton avatar balupton commented on April 28, 2024

Regarding the above, to reiterate: https://github.com/bevry/make-deno-edition generates a compatible edition with all the necessary path remapping performed automatically, for local deps, deno deps, and node_module deps.

from deno.

loohoo avatar loohoo commented on April 28, 2024

I am new to deno. After reading the deno manual and this article, my mind is a mess. I have no idea what the best solution for dependency management is...

from deno.

lucacasonato avatar lucacasonato commented on April 28, 2024

@loohoo You can use deps.ts and mod.ts without concern. That article and benchmark were written before we had incremental TypeScript compilation. It is not up to date anymore.

from deno.

azwar avatar azwar commented on April 28, 2024

NPM was easily abused, and Deno's model is even easier to abuse. See: https://www.veracode.com/blog/research/abusing-npm-libraries-data-exfiltration. We have to build both good package management and a good central repository, like Java and C# have.

from deno.
