jcaddel / maven-s3-wagon
Multi-threaded wagon to connect Maven with Amazon S3
Just can't make this work.
I use this command:
mvn deploy -Daws.accessKeyId=<A_KEY> -Daws.secretKey=<A_SECRET_KEY> -X
and get back this:
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: <...>, AWS Error Code: InvalidRequest, AWS Error Message: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256., S3 Extended Request ID: <...>
This is a log of the request and response headers:
[DEBUG] >> "GET /?max-keys=0 HTTP/1.1[\r][\n]"
[DEBUG] >> "Host: <BUCKET_NAME>.s3.amazonaws.com[\r][\n]"
[DEBUG] >> "Authorization: AWS <KEY>[\n]"
[DEBUG] >> "User-Agent: aws-sdk-java/1.6.4 Mac_OS_X/10.8.5 Java_HotSpot(TM)_64-Bit_Server_VM/25.25-b02[\r][\n]"
[DEBUG] >> "Date: Thu, 18 Feb 2016 12:15:40 GMT[\r][\n]"
[DEBUG] >> "Content-Type: application/x-www-form-urlencoded; charset=utf-8[\r][\n]"
[DEBUG] >> "Connection: Keep-Alive[\r][\n]"
[DEBUG] >> "[\r][\n]"
[DEBUG] << "HTTP/1.1 400 Bad Request[\r][\n]"
[DEBUG] << "x-amz-bucket-region: eu-central-1[\r][\n]"
[DEBUG] << "x-amz-request-id: 097DD717AF3D5842[\r][\n]"
[DEBUG] << "x-amz-id-2: IJS/lkQ2UoDf0TIp4sQzXqgD0PfiRNUsVrZTty00EzOIvIBMTuTFZpmuXNlj8964DDfurfyBDN8=[\r][\n]"
[DEBUG] << "x-amz-region: eu-central-1[\r][\n]"
[DEBUG] << "Content-Type: application/xml[\r][\n]"
[DEBUG] << "Transfer-Encoding: chunked[\r][\n]"
[DEBUG] << "Date: Thu, 18 Feb 2016 12:15:41 GMT[\r][\n]"
[DEBUG] << "Connection: close[\r][\n]"
[DEBUG] << "Server: AmazonS3[\r][\n]"
[DEBUG] << "[\r][\n]"
[DEBUG] << "144[\r][\n]"
[DEBUG] << "<?xml version="1.0" encoding="UTF-8"?>[\n]"
[DEBUG] << "<Error><Code>InvalidRequest</Code><Message>The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.</Message><RequestId>097DD717AF3D5842</RequestId><HostId>IJS/lkQ2UoDf0TIp4sQzXqgD0PfiRNUsVrZTty00EzOIvIBMTuTFZpmuXNlj8964DDfurfyBDN8=</HostId></Error>"
Option 2: Using a Client-Side Master Key
http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingClientSideEncryption.html
One of the very common ways to authenticate with AWS is to use the aws command line tools to set up the contents of ~/.aws to have auth information in them. (The standard java S3 client supports this). It would be great if the s3 wagon also did!
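For what it's worth, supporting this mostly means reading the INI-style credentials file. The sketch below shows the parsing involved; `AwsCredentialsFile` and `readProfile` are hypothetical names for illustration, while the real SDK class that does this is `com.amazonaws.auth.profile.ProfileCredentialsProvider`.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Sketch of what supporting ~/.aws/credentials would involve: parse the
// INI-style file and pick the key/value pairs out of the requested section
// (e.g. [default] with aws_access_key_id / aws_secret_access_key entries).
public class AwsCredentialsFile {
    static Map<String, String> readProfile(Path file, String profile) throws IOException {
        Map<String, String> values = new HashMap<>();
        String current = null;
        for (String line : Files.readAllLines(file)) {
            line = line.trim();
            if (line.startsWith("[") && line.endsWith("]")) {
                // Section header names the profile, e.g. [default]
                current = line.substring(1, line.length() - 1);
            } else if (profile.equals(current) && line.contains("=")) {
                String[] kv = line.split("=", 2);
                values.put(kv[0].trim(), kv[1].trim());
            }
        }
        return values;
    }
}
```

In practice the wagon would just add `ProfileCredentialsProvider` to its provider chain rather than parsing the file itself.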
I have two server buckets defined - one for the site and one for artifacts. Objects uploaded to the artifact bucket are created with the permission as defined in settings.xml, but objects sent to the site bucket always end up with public-read. From what I can see in the code, this may not be caused by maven-s3-wagon but an interaction with the site plugin. Can you advise on whether or not this is known (or expected) behaviour?
For some reason, the sources and javadoc artifacts cannot be downloaded, even though they are present in the bucket.
The AWS Java SDK introduced EC2ContainerCredentialsProviderWrapper and ContainerCredentialsProvider in, I think, 2016. Could these please be added to MavenAwsCredentialsProviderChain.java so that this extension can operate from within a Docker image used by the AWS CodeBuild service?
Caused by: java.net.URISyntaxException: Illegal character in path at index 103: http://s3.amazonaws.com/some-bucket/some file with space.jar
at java.net.URI$Parser.fail(URI.java:2829)
at java.net.URI$Parser.checkChars(URI.java:3002)
at java.net.URI$Parser.parseHierarchical(URI.java:3086)
at java.net.URI$Parser.parse(URI.java:3034)
at java.net.URI.<init>(URI.java:595)
at org.kuali.maven.wagon.S3Wagon.getNormalizedKey(S3Wagon.java:301)
... 28 more
Simple fix for getNormalizedKey() could be...
URI rawURI = new URI(urlString.replace(" ", "%20"));
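That one-liner could be wrapped as a sketch; `KeyNormalizer` and `toUri` are hypothetical names for illustration, not the wagon's actual API.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class KeyNormalizer {
    // Mirrors the suggested fix: percent-encode spaces before handing the
    // string to java.net.URI, which rejects raw spaces in paths.
    static URI toUri(String urlString) throws URISyntaxException {
        return new URI(urlString.replace(" ", "%20"));
    }
}
```

Note that java.net.URI's multi-argument constructors (e.g. `new URI(scheme, host, path, fragment)`) quote illegal characters automatically, which would also cover characters other than spaces.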
seahen/maven-s3-wagon#29
I am trying to find a live fork.
Also, if you know a good GitHub or GitLab organization where we could host this package, that would be nice. There is no guarantee that the next "live" fork won't go stale in a few years.
Credentials delivered through the Amazon EC2 container service if the AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variable is set and the security manager has permission to access the variable
#4 https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/auth/DefaultAWSCredentialsProviderChain.html
Hi,
The wagon fails to authenticate successfully when the S3 bucket is present in a region that doesn't support AWS authentication v2 (since AWS java SDK uses v2 by default).
However, it is possible to provide the system property com.amazonaws.services.s3.enableV4
to the sdk to make it switch to v4 authentication protocol. See https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/constant-values.html
For context, the auth failure message is:
Could not connect to repository: Status Code: 400, AWS Service: Amazon S3, AWS Request ID: XXXXXXXXXXX, AWS Error Code: InvalidRequest, AWS Error Message: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.
Is it currently possible to configure this system property in the wagon?
If not, I'd be happy to dig deeper and make a PR. I'm pretty new to the Java ecosystem so hopefully you'll bear with me :)
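As a sketch of the workaround: the AWS SDK for Java v1 checks this system property when choosing a signer, so setting it before the S3 client is created forces SigV4 (AWS4-HMAC-SHA256). With Maven it can also be passed from outside, e.g. `MAVEN_OPTS="-Dcom.amazonaws.services.s3.enableV4=true"`.

```java
public class EnableSigV4 {
    public static void main(String[] args) {
        // Property name from the AWS SDK constant-values documentation
        // linked above; must be set before the S3 client is constructed.
        System.setProperty("com.amazonaws.services.s3.enableV4", "true");
        System.out.println(System.getProperty("com.amazonaws.services.s3.enableV4"));
    }
}
```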
Hello Dev team,
We are working on a Maven application which needs to talk to a private repository on S3 and download all the needed jars into the .m2 repository.
We are using the following versions of Maven and the S3 wagon:
Maven 2.2.1
Wagon: org.kuali.maven.wagons:maven-s3-wagon:1.1.14
We also provided the S3 access key and secret key in settings.xml.
We get the following transfer error, but if we manually create the directory in the .m2 repo, we are able to download the needed jars from the private S3 bucket.
Can you please help us resolve this issue, and let us know whether there is an S3 wagon version compatible with Maven 2.2.1?
Your help is much appreciated. Thank you in advance.
directory C:\kuali-app\kuali-app\src\main\resources
excludes []
includes []
[INFO] skip non existing resourceDirectory C:\kuali-app\kuali-app\src\main\resources
[DEBUG] org.sb.kuali:kuali-app:jar:0.0.1-SNAPSHOT (selected for null)
[DEBUG] junit:junit:jar:3.8.1:test (selected for test)
[INFO] snapshot org.sb.kuali:kuali-test:0.0.1-SNAPSHOT: checking for updates from maven-s3-repo
[DEBUG] Connecting to repository: 'maven-s3-repo' with url: 's3://kuali-test'.
[INFO] Logged in - kuali-test
[INFO] Downloading: s3://kuali-test/org/sb/kuali/kuali-test/0.0.1-SNAPSHOT/maven-metadata.xml
[ERROR] Transfer error: java.io.FileNotFoundException: C:\Users\Reshma\.m2\repository\org\sb\kuali\kuali-test\0.0.1-SNAPSHOT\maven-metadata-maven-s3-repo.xml.tmp (The system cannot find the path specified)
java.io.FileNotFoundException: C:\Users\Reshma\.m2\repository\org\sb\kuali\kuali-test\0.0.1-SNAPSHOT\maven-metadata-maven-s3-repo.xml.tmp (The system cannot find the path specified)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:171)
at org.kuali.maven.wagon.TransferProgressFileOutputStream.<init>(TransferProgressFileOutputStream.java:35)
at org.kuali.maven.wagon.S3Wagon.getResource(S3Wagon.java:189)
at org.kuali.maven.wagon.AbstractWagon.get(AbstractWagon.java:178)
at org.apache.maven.artifact.manager.DefaultWagonManager.getRemoteFile(DefaultWagonManager.java:546)
at org.apache.maven.artifact.manager.DefaultWagonManager.getArtifactMetadata(DefaultWagonManager.java:443)
at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.resolve(DefaultRepositoryMetadataManager.java:97)
at org.apache.maven.artifact.transform.AbstractVersionTransformation.resolveVersion(AbstractVersionTransformation.java:65)
at org.apache.maven.artifact.transform.SnapshotTransformation.transformForResolve(SnapshotTransformation.java:63)
at org.apache.maven.artifact.transform.DefaultArtifactTransformationManager.transformForResolve(DefaultArtifactTransformationManager.java:55)
at org.apache.maven.artifact.resolver.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:145)
at org.apache.maven.artifact.resolver.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:90)
at org.apache.maven.project.DefaultMavenProjectBuilder.findModelFromRepository(DefaultMavenProjectBuilder.java:558)
at org.apache.maven.project.DefaultMavenProjectBuilder.buildFromRepository(DefaultMavenProjectBuilder.java:251)
at org.apache.maven.project.artifact.MavenMetadataSource.retrieveRelocatedProject(MavenMetadataSource.java:163)
at org.apache.maven.project.artifact.MavenMetadataSource.retrieveRelocatedArtifact(MavenMetadataSource.java:94)
at org.apache.maven.artifact.resolver.DefaultArtifactCollector.recurse(DefaultArtifactCollector.java:387)
at org.apache.maven.artifact.resolver.DefaultArtifactCollector.collect(DefaultArtifactCollector.java:74)
at org.apache.maven.artifact.resolver.DefaultArtifactResolver.resolveTransitively(DefaultArtifactResolver.java:316)
at org.apache.maven.artifact.resolver.DefaultArtifactResolver.resolveTransitively(DefaultArtifactResolver.java:304)
at org.apache.maven.plugin.DefaultPluginManager.resolveTransitiveDependencies(DefaultPluginManager.java:1499)
at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:442)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:556)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:535)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:348)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
[INFO] Logged off - kuali-test
[INFO] Transfers: 1 Time: 435ms Amount: 0.0k Throughput: 0.0 kB/s
[WARNING] repository metadata for: 'snapshot org.sb.kuali:kuali-test:0.0.1-SNAPSHOT' could not be retrieved from repository: maven-s3-repo due to an error: Transfer of resource C:\Users\Reshma\.m2\repository\org\sb\kuali\kuali-test\0.0.1-SNAPSHOT\maven-metadata-maven-s3-repo.xml.tmp failed
[DEBUG] Exception
org.apache.maven.wagon.TransferFailedException: Transfer of resource C:\Users\Reshma\.m2\repository\org\sb\kuali\kuali-test\0.0.1-SNAPSHOT\maven-metadata-maven-s3-repo.xml.tmp failed
at org.kuali.maven.wagon.AbstractWagon.get(AbstractWagon.java:189)
at org.apache.maven.artifact.manager.DefaultWagonManager.getRemoteFile(DefaultWagonManager.java:546)
at org.apache.maven.artifact.manager.DefaultWagonManager.getArtifactMetadata(DefaultWagonManager.java:443)
at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.resolve(DefaultRepositoryMetadataManager.java:97)
at org.apache.maven.artifact.transform.AbstractVersionTransformation.resolveVersion(AbstractVersionTransformation.java:65)
at org.apache.maven.artifact.transform.SnapshotTransformation.transformForResolve(SnapshotTransformation.java:63)
POM file:
<modelVersion>4.0.0</modelVersion>
<groupId>org.sb.kuali</groupId>
<artifactId>kuali-app</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>kuali-app</name>
<url>http://maven.apache.org</url>
<repositories>
<repository>
<id>maven-s3-repo</id>
<name>S3 Release Repository</name>
<url>s3://kuali-test</url>
</repository>
</repositories>
<build>
<extensions>
<extension>
<!-- <groupId>org.springframework.build</groupId>
<artifactId>aws-maven</artifactId>
<version>5.0.0.RELEASE</version> -->
<groupId>org.kuali.maven.wagons</groupId>
<artifactId>maven-s3-wagon</artifactId>
<version>1.1.14</version>
</extension>
</extensions>
</build>
I don't know why my acl settings are ignored in my settings.xml file, although clearly they must be read from there, just as the username and password are.
I've tried various things, and maven --debug does not print the acl debug output from the code.
Any tips?
Settings.xml snippet
<server>
<id>awsaccount</id>
<configuration>
<acl>Private</acl>
</configuration>
<username>AXXXXXXXXXXXXXXXA</username>
<password>XXXXXXXXXXXXXXXXXXXXXXXX</password>
</server>
pom.xml snippet
<extensions>
<!-- Deploy to S3 extension -->
<extension>
<groupId>org.kuali.maven.wagons</groupId>
<artifactId>maven-s3-wagon</artifactId>
<version>1.1.10</version>
</extension>
</extensions>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>wagon-maven-plugin</artifactId>
<version>1.0-beta-3</version>
<executions>
<execution>
<id>awsaccount</id>
<phase>deploy</phase>
<goals>
<goal>upload</goal>
</goals>
<configuration>
<serverId>awsaccount</serverId>
<includes>**</includes>
<excludes>pom.xml</excludes>
<fromDir>src/main/resources</fromDir>
<url>s3://MYBUCKET/simontest</url>
</configuration>
</execution>
</executions>
</plugin>
Hi,
I am using the s3-wagon through Gradle and the Gradle maven plugin. Initially I deployed from my machine, but then moved the deployment to Travis CI, where I encountered a stacktrace.
Transfer error: java.io.FileNotFoundException: /home/travis/.m2/repository/com/fidesmo/gradle-fidesmo/0.1-SNAPSHOT/maven-metadata-remote.xml.tmp (No such file or directory)
java.io.FileNotFoundException: /home/travis/.m2/repository/com/fidesmo/gradle-fidesmo/0.1-SNAPSHOT/maven-metadata-remote.xml.tmp (No such file or directory)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at java.io.FileOutputStream.<init>(FileOutputStream.java:171)
at org.kuali.maven.wagon.TransferProgressFileOutputStream.<init>(TransferProgressFileOutputStream.java:35)
at org.kuali.maven.wagon.S3Wagon.getResource(S3Wagon.java:242)
at org.kuali.maven.wagon.AbstractWagon.get(AbstractWagon.java:171)
at org.apache.maven.artifact.manager.DefaultWagonManager.getRemoteFile(DefaultWagonManager.java:546)
at org.apache.maven.artifact.manager.DefaultWagonManager.getArtifactMetadataFromDeploymentRepository(DefaultWagonManager.java:452)
at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.getArtifactMetadataFromDeploymentRepository(DefaultRepositoryMetadataManager.java:379)
at org.apache.maven.artifact.repository.metadata.DefaultRepositoryMetadataManager.resolveAlways(DefaultRepositoryMetadataManager.java:347)
at
Which I would have expected, since I never used the local Maven repository while deploying. The issue only arises if this was not the first deployment of the artifact, and it was reproducible from my workstation after removing the ~/.m2 directory.
Since I am quite new to the ecosystem, I don't know whether this is an issue with the wagon or with the Maven plugin for Gradle, so I filed the bug here. My current quick fix is to first run gradle install, which deploys to the local repository.
Hello,
I use the Shade plugin to create a single fat executable jar with a specific name; it contains my whole application. Is there a way to use the maven-s3-wagon to upload just this jar? I don't mind uploading the md5 or sha1 files too, but it seems that the jar generated by the Shade plugin is not used at all.
Also, is there a way to upload to a specific folder in the S3 bucket rather than one created from the package paths?
I have an IAM user created for Maven to upload the site files into an S3 bucket. When I use a policy for a IAM user like:
{
  "Statement": [
    {
      "Sid": "Stmt1234567",
      "Action": ["s3:*"],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}
The upload works. When I use:
{
  "Statement": [
    {
      "Sid": "Stmt13722196541",
      "Action": ["s3:*"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::bucket_name"
    }
  ]
}
The upload fails. What are the permissions needed on the S3 side (user or bucket policy) to allow the plugin to only have access to one of the S3 buckets? I know it's probably because it needs to be able to list all the buckets to find the one it's using, but I've been unable to find the right combination of policies that will allow the plugin to run.
Could this be added to the wiki pages? Thanks.
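For reference, the key detail is that a bucket-level ARN (`arn:aws:s3:::bucket_name`) does not cover object operations, which need the `/*` form. A commonly used least-privilege sketch follows; the exact action list is an assumption about what the wagon needs, not taken from its documentation:

```json
{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::bucket_name"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::bucket_name/*"
    }
  ]
}
```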
I've used various versions of this plugin and they all exhibit the same occasional problem. Uploads to S3 can fail with the following exception:
java.io.IOException: Input stream cannot be reset as 18290688 bytes have been written, exceeding the available buffer size of 131072
After some investigation it seems the error is caused by TransferProgressFileInputStream not really being repeatable. Should the upload fail, the retry can fail as the buffer is not properly set up.
One solution would be to make TransferProgressFileInputStream extend the AWS RepeatableFileInputStream instead.
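The "repeatable" idea behind that suggestion can be sketched without the AWS SDK: remember the file position at mark() and seek back to it on reset(), instead of relying on a bounded in-memory buffer that overflows on large uploads. `RepeatableFileStream` and its details are illustrative, not the wagon's actual code:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class RepeatableFileStream extends InputStream {
    private final File file;
    private FileInputStream in;
    private long markPos = 0;
    private long pos = 0;

    public RepeatableFileStream(File file) throws IOException {
        this.file = file;
        this.in = new FileInputStream(file);
    }

    @Override public int read() throws IOException {
        int b = in.read();
        if (b != -1) pos++;
        return b;
    }

    @Override public boolean markSupported() { return true; }

    @Override public synchronized void mark(int readlimit) {
        // Record a position instead of buffering bytes, so the readlimit
        // never constrains how far a retry can rewind.
        markPos = pos;
    }

    @Override public synchronized void reset() throws IOException {
        // Reopen the file and skip forward; never throws for exceeding a buffer.
        in.close();
        in = new FileInputStream(file);
        in.skip(markPos);
        pos = markPos;
    }

    @Override public void close() throws IOException { in.close(); }
}
```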
Hi,
Do you think it is feasible to add support for the full range of permissions? Currently you have only "public", hard-coded as:
request.setCannedAcl(CannedAccessControlList.PublicRead);
See com.amazonaws.services.s3.model.CannedAccessControlList.
Thanks.
/**
 * Create a PutObjectRequest based on the source file and destination passed in
 */
protected PutObjectRequest getPutObjectRequest(File source, String destination, TransferProgress progress) {
    try {
        String key = getNormalizedKey(source, destination);
        String bucketName = bucket.getName();
        InputStream input = getInputStream(source, progress);
        ObjectMetadata metadata = getObjectMetadata(source, destination);
        PutObjectRequest request = new PutObjectRequest(bucketName, key, input, metadata);
        request.setCannedAcl(CannedAccessControlList.PublicRead);
        return request;
    } catch (FileNotFoundException e) {
        throw new AmazonServiceException("File not found", e);
    }
}
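A configurable ACL could be sketched as a name-to-constant lookup. `CannedAcl` below is a local stand-in for `com.amazonaws.services.s3.model.CannedAccessControlList` so the pattern runs without the SDK on the classpath, and `AclResolver`/`resolve` are hypothetical names:

```java
public class AclResolver {
    // Stand-in for the SDK's CannedAccessControlList enum (same constant names).
    enum CannedAcl { Private, PublicRead, PublicReadWrite, AuthenticatedRead,
                     BucketOwnerRead, BucketOwnerFullControl }

    static CannedAcl resolve(String configured) {
        // Fall back to the wagon's current hard-coded default when nothing
        // is configured; otherwise match the enum constant name exactly,
        // e.g. "AuthenticatedRead".
        return configured == null ? CannedAcl.PublicRead : CannedAcl.valueOf(configured);
    }
}
```

The configured value could come from the `<configuration>` block of the server entry in settings.xml, the same place the acl/filePermissions settings discussed elsewhere in these issues live.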
Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.8.2:deploy (default-deploy) on project docker-demo: Failed to retrieve remote metadata com.example:docker-demo:0.0.1-SNAPSHOT/maven-metadata.xml: Could not transfer metadata com.example:docker-demo:0.0.1-SNAPSHOT/maven-metadata.xml from/to snapshot-maven (s3://snapshot-maven/snapshot): 'us-east-2' is not a valid location constraint -> [Help 1]
This was the error when I executed mvn clean deploy.
It seems this S3 wagon doesn't support an HTTPS proxy server. At least, it doesn't work with the standard -Dhttps.proxyHost/-Dhttps.proxyPort properties. The HTTPS_PROXY environment variable also doesn't help.
Hi,
I followed these instruction - https://github.com/jcaddel/maven-s3-wagon/wiki/Usage
I am getting pretty low throughput while uploading file to S3.
Below are the figures -
[INFO] Transfers: 3 Time: 7.556m Amount: 15.0m Throughput: 33.942 KB/s
While I am uploading the same file through AWS S3 console, I am able to upload the file in max of 10 seconds.
Googling didn't help.
Please advise whether I can tweak some configuration to speed it up.
The wagon uses Signature V2 authentication, which AWS is deprecating for all of S3 on June 24, 2019. Currently the plugin only works in a few regions, those launched before 2014.
com.amazonaws:aws-java-sdk:1.3.8, which you depend on, already contains this mime.types file. Why are you duplicating it?
I added the 1.2.1 maven-s3-wagon to the POM file as follows.
In ~/.m2/settings.xml, I added the filePermissions as follows.
I hope only authorized users will have read access, but the uploaded artifacts seem to have PublicRead permission. Is there something wrong with my settings?
Any ideas are appreciated!
The example for file permission configuration in settings.xml should read:
<server>
<id>s3.snapshot</id>
<filePermissions>AuthenticatedRead</filePermissions>
</server>
At least, that is the behavior in my tests here using Maven 3.0.4.
Where the user credentials are defined within one account and a cross account role provides the permissions to write to the S3 bucket there is a need to use the STSAssumeRoleSessionCredentialsProvider.
This issue would relate to supporting ~/.aws/credentials.
As far as I understand, at the moment the plugin doesn't support any S3 endpoint other than s3, the default one. My bucket is in Singapore and its endpoint is s3-ap-southeast-1. It would be nice to make it possible to configure an endpoint, maybe in settings.xml inside the server element (http://maven.apache.org/settings.html#Servers):
<servers>
<server>
<id>server001</id>
<username>my_login</username>
<password>my_password</password>
<configuration>
<endpoint>s3-ap-southeast-1.amazonaws.com</endpoint>
</configuration>
</server>
</servers>
Newer versions of EnvironmentVariableCredentialsProvider support AWS_SESSION_TOKEN. Could you please upgrade the aws.version?