
Codespeed


Codespeed is a web application to monitor and analyze the performance of your code.

Known to be used by CPython, PyPy, Twisted and others.

For an overview of some application concepts, see the wiki page.

Installation

You will need Python 2.7 or 3.5+.

To install dependencies and the codespeed Django app:

pip install codespeed

If you want version control integration, there are additional requirements:

  • Subversion needs pysvn: python-svn
  • Mercurial needs the mercurial package to clone the repo locally
  • Git needs the git package to clone the repo
  • For GitHub, the isodate package is required, but not git: pip install isodate

Note: For git or Mercurial repos, the first time the Changes view is accessed, Codespeed will try to clone the repo, which, depending on the size of the project, can take a long time. Please be patient.

  • Download the latest stable release from github.com/tobami/codespeed/tags, unpack it and install it with python setup.py install.

  • To get started, you can use the sample_project directory as a starting point for your Django project, which can be configured by editing sample_project/settings.py.

  • For simplicity, you can use the default sqlite configuration, which will save the data to a database named data.db

  • Create the DB by typing from the root directory:

      python manage.py migrate
    
  • Create an admin user:

      python manage.py createsuperuser
    
  • For testing purposes, you can now start the development server:

      python manage.py runserver 8000
    

The codespeed installation can now be accessed by navigating to http://localhost:8000/.

Note: for production, you should configure a real web server such as Apache or nginx (refer to the Django docs). You should also modify sample_project/settings.py and set DEBUG = False. sample_project/README.md also describes some production settings.

Codespeed configuration

Using the provided test data

If you want to test drive Codespeed, you can use the testdata.json fixtures to have a working data set to browse.

  • From the root directory, type:

      ./manage.py loaddata codespeed/fixtures/testdata.json
    

Starting from scratch

Before you can start saving (and displaying) data, you need to first create an environment and define a default project.

  • Go to http://localhost:8000/admin/codespeed/environment/ and create an environment.
  • Go to http://localhost:8000/admin/codespeed/project/ and create a project.

Check the field "Track changes" and, in case you want version control integration, configure the relevant fields.

Note: Only executables associated with projects whose "track changes" field is checked will be shown in the Changes and Timeline views.

Note: Git and Mercurial need to clone the repository locally. That means your sample_project/repos directory must be writable by the web server user. For a typical Apache installation, run sudo chown www-data:www-data sample_project/repos

Saving data

Data is saved by POSTing to http://localhost:8000/result/add/.

You can use the script tools/save_single_result.py as a guide. When saving large quantities of data, it is recommended to use the JSON API instead: http://localhost:8000/result/add/json/

An example script is located at tools/save_multiple_results.py

Note: If the given executable, benchmark, project, or revision does not yet exist, it will be created automatically, together with the actual result entry. The only model which won't be created automatically is the environment. It must always exist, or the data won't be saved (which is why creating it is listed as a necessary step in the previous "Codespeed configuration" section).
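As a sketch of what such a save script does, using only the standard library (the field names follow tools/save_single_result.py; treat them as assumptions and check the version you are running):

```python
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_result(commitid, branch, project, executable, benchmark,
                 environment, result_value):
    """Assemble the form fields for /result/add/. Field names follow
    tools/save_single_result.py; verify against your Codespeed version."""
    return {
        'commitid': commitid,
        'branch': branch,            # e.g. 'master' for git, 'default' for Mercurial
        'project': project,
        'executable': executable,
        'benchmark': benchmark,
        'environment': environment,  # must already exist (see "Starting from scratch")
        'result_value': result_value,
    }

def post_result(data, url='http://localhost:8000/result/add/'):
    """POST the form-encoded payload; raises urllib.error.HTTPError on failure."""
    req = Request(url, urlencode(data).encode('ascii'))
    with urlopen(req) as resp:
        return resp.read().decode()
```

For bulk uploads, the JSON API endpoint mentioned above is the better fit, since it accepts many results in one request.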

Further customization

Custom Settings

You may override any of the default settings by setting them in sample_project/settings.py. It is strongly recommended that you only override the settings you need by importing the default settings and replacing only the values needed for your customizations:

from codespeed.settings import *

DEF_ENVIRONMENT = "Dual Core 64 bits"

Site-wide Changes

All pages inherit from the base.html template. To change every page on the site, simply edit sample_project/templates/codespeed/base_site.html and override the appropriate block:

  • Custom title: you may replace the default "My Speed Center" title block with your preferred value:

      {% block title %}
          My Project's Speed Center
      {% endblock %}
    
  • Replacing logo.png: Place your logo in sample_project/static/images/logo.png

  • Logo with custom filename: Place your logo in sample_project/static/images/ and add a block like this to base_site.html:

      {% block logo %}
          <img src="{{ MEDIA_URL }}images/my-logo.jpg" width="120" height="48" alt="My Project">
      {% endblock logo %}
    

    N.B.: the layout will stay exactly the same for any image with a height of 48px (any width will do).

  • Custom JavaScript or CSS: add your files to the sample_project/static/js directory and extend the extra_head template block:

      {% block extra_head %}
          {{ block.super }}
          <script type="text/javascript" src="{{ MEDIA_URL }}static/js/my_cool_tweaks.js"></script>
      {% endblock extra_head %}
    

Specific Pages

Since sample_project/templates/codespeed is the first entry in settings.TEMPLATE_DIRS, you may override any template on the site simply by creating a new one with the same name.

  • About page: create sample_project/templates/about.html:

      {% extends "codespeed/base_site.html" %}
      {% block title %}{{ block.super }}: About this project{% endblock %}
      {% block body %}
          <div id="sidebar"></div>
          <div id="about" class="about_content clearfix">
              Your content here
          </div>
      {% endblock %}
    

Baselines and Comparison view executables

  • The results associated with an executable and a revision that has a non-blank tag field will be listed as a baseline option in the Timeline view.
  • Additionally, the Comparison view will also show, as executables, the results of the latest revision of every tracked project.

Defaults

The file sample_project/settings.py can contain customizations of several parameters (the file includes comments with full examples).

General settings

  • WEBSITE_NAME: The RSS results feed will use this parameter as the site name
  • DEF_BASELINE: Defines which baseline option will be chosen as default in the Timeline and Changes views.
  • DEF_ENVIRONMENT: Defines which environment should be selected as default in the Changes and Timeline views.
  • CHANGE_THRESHOLD
  • TREND_THRESHOLD

Home Page

The main customization for the homepage is to display either the reports (daily changes) or the historical graphs (improvement over time).

  • SHOW_REPORTS: If set to True displays a table and RSS feed with the latest results

  • SHOW_HISTORICAL: Displays two graphs comparing tagged and latest versions of the default executable and project to the configured baseline.

    To activate the historical graphs you will need to set the following settings (using values that work with the test data):

    • DEF_BASELINE = {'executable': 'baseExe', 'revision': '444'}
    • DEF_EXECUTABLE = "myexe O3 64bits"
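Putting these together, the relevant overrides in sample_project/settings.py might look like this (a sketch; the values are the ones stated above for the bundled test data):

```python
# sample_project/settings.py -- a sketch using values that work with the test data
from codespeed.settings import *

SHOW_REPORTS = True      # table and RSS feed with the latest results
SHOW_HISTORICAL = True   # the two historical comparison graphs
DEF_BASELINE = {'executable': 'baseExe', 'revision': '444'}
DEF_EXECUTABLE = "myexe O3 64bits"
```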

Changes View

  • DEF_EXECUTABLE: in the Changes view, a random executable is chosen as default. If that doesn't suit you, you can specify here which one should be selected. You need to specify its id (since the name alone is not unique).

Timeline View

  • DEF_BENCHMARK: Defines the default timeline view. Possible values:
    • None: will show a grid of plot thumbnails, or a text message when the number of plots exceeds 30
    • grid: will always show as default the grid of plots
    • show_none: will show a text message (better default when there are lots of benchmarks)
    • mybench: will select benchmark named "mybench"

Comparison View

  • CHART_TYPE: Chooses the default chart type (normal bars, stacked bars or relative bars)
  • NORMALIZATION: Defines whether normalization should be enabled as default in the Comparison view.
  • CHART_ORIENTATION: horizontal or vertical
  • COMP_EXECUTABLES: by default, all executables will be checked. When there are a large number of tags or executables, it is better to select only a few so that the plots are not too cluttered. Given as a list of tuples containing the name of an executable and the commitid of a revision; an 'L' denotes the latest revision. Example:

      COMP_EXECUTABLES = [
          ('myexe', '21df2423ra'),
          ('myexe', 'L'),
      ]

  • COMPARISON_COMMIT_TAGS: Defines a list of tags to display on the comparison page. This comes in handy when there are a lot of tags. It defaults to None, which means all available tags are displayed.

VCS Provider Specific Settings

Github

  • GITHUB_OAUTH_TOKEN - GitHub OAuth token to use for authenticating against the GitHub API. If not provided, Codespeed falls back to unauthenticated API requests, which have low rate limits; an exception may be thrown when retrieving info from the GitHub API once the rate limit is reached.

Getting help

For help regarding the configuration of Codespeed, or to share any ideas or suggestions you may have, please post on Codespeed's discussion group.

Reporting bugs

If you find a bug in Codespeed, please report it on the GitHub issue tracker.


codespeed's Issues

Missing migration for Revision.project null=True

I was going to work on #35, but when I wanted to create a new migration, south found another change:

$ python manage.py schemamigration codespeed --auto
 ? The field 'Revision.project' does not have a default specified, yet is NOT NULL.
 ? Since you are making this field nullable, you MUST specify a default
 ? value to use for existing rows. Would you like to:
 ?  1. Quit now, and add a default to the field in models.py
 ?  2. Specify a one-off value to use for existing columns now
 ?  3. Disable the backwards migration by raising an exception.

In 6cefd8e null=True was added to the definition of Revision.project, but apparently no migration was added.

If I'm not mistaken, each revision has a link to a project (a revision belongs to a branch, and that branch belongs to a project). Why can project be left out on a revision then?

Static URL in codespeed.js

The URL embedded in the js file is not correct when static files are served from another base URL.
The offender is the img tag on line 27:
https://github.com/tobami/codespeed/blob/master/codespeed/static/js/codespeed.js#L27

Is there a way to adapt that URL automatically?

Obviously, adding a {{ STATIC_URL }} does not help, since the file is served directly without being processed.
I did not see anything in the django manual which would allow preprocessing during ./manage.py collectstatic either.
That would have been nice.

Are there any other options than serving codespeed.js as a template which gets preprocessed?

Thanks
Stefan

Result reports

There should be a report for each revision that summarizes significant changes in the results. These reports can then be shown on the home page, sent as email alerts and added to an RSS feed (email alerts and RSS belong to another issue).

Switch to HTML5

A couple of JS improvements were merged in commit ebf9873, which makes use of HTML5's custom data attributes. My question in the pull request was probably not visible enough, so I'm asking here. Using custom data attributes is only valid in HTML5 (or you could provide a custom DTD, but I don't think anyone does that).

In base.html a comment suggests that a switch to HTML5 was considered: https://github.com/tobami/codespeed/blob/master/example/templates/base.html#L18

I don't have any particular feature in mind that needs HTML5 (except custom data attributes), but I don't see anything wrong with upgrading.

Page navigation support for timeline.

Hi:

Would be great if the timeline could support page navigation, i.e., the browser's back-button.
At the moment, you always return to the overview instead of a specific benchmark.

Thanks
Stefan

Possible regression in 0.8.x upon POST /result/add/

The sample script save_single_result.py throws the following exception in versions 0.8.x of codespeed:

WARNING:root:unable to save revision Jul 07, 02:17 - 14 info: updaterepo() takes no arguments (1 given)
Traceback (most recent call last):
  File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 797, in save_result
    saverevisioninfo(rev)
  File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 709, in saverevisioninfo
    log = getcommitlogs(rev, rev, update=True)
  File "/work/codespeed/speedcenter/../speedcenter/codespeed/views.py", line 698, in getcommitlogs
    updaterepo(rev.branch.project)
TypeError: updaterepo() takes no arguments (1 given)
...
[07/Jul/2011 02:17:07] "POST /result/add/ HTTP/1.1" 500 122371

This seems to be a regression; this worked fine in versions 0.6.x and 0.7.0.

overwrite seems not to be the Django way

Hi,

First of all thank you for this great piece of software. It's fun using it.

Compared to other Django projects I find your template overwrite mechanism a bit unusual. The Django documentation describes the TEMPLATE_DIRS variable in settings.py to configure where the template loader is looking for templates.
http://www.djangobook.com/en/beta/chapter04/

Wouldn't it be better to add something like

TEMPLATE_DIRS = (
    os.path.join(os.path.dirname(__file__), 'overwrite_templates').replace('\\', '/'),
    os.path.join(os.path.dirname(__file__), 'templates').replace('\\', '/'),
)

to the settings.py?

Strictly speaking in Django words: speedcenter is the project and codespeed the app. Maybe it would be best to separate the two.

Thanks,

Frank

Multiple Project UI

Currently the front-end provides no way to select which project to view, which makes it hard to use as a multi-project service.

If this seems reasonable I'm willing to add the proposed changes below:

  1. Add a slug field to Project
  2. Move all URLs to be prefixed with the project slug
  3. Have / redirect if there's only one project (maybe controlled by a flag?) or display a list otherwise

New view with a history of regressions

Users should be able to see a list of past regressions, to have a history of changesets that adversely affected performance.

Further, regressions should be annotated somehow: either by letting users write notes, visible to everyone, or by automatically creating an issue at the project's bug tracker (Roundup for PyPy).

timeline view fails if no benchmark is selected

After creating a new codespeed instance with an environment and a project and posting a result to it using tools/save_single_result.py (after fixing the KeyError problem), loading the timeline view hangs indefinitely because the ajax request for data fails with a 500. This isn't handled and the spinner just keeps spinning.

The request is made for something like:

/timeline/json/?_=1282581197452&exe=1&base=none&env=boson&revs=200

Note the missing "ben" key. This is because no benchmark is selected and the "ben" property of the configuration object is therefore "undefined". This corresponds to the radio input for benchmark selection which has just one choice, "float" (as per the default values in the save_single_result.py tool) which is not selected.

Selecting the "float" benchmark causes another request to be made which succeeds and the timeline renders properly.

Unlike the timeline on speed.pypy.org, I also see no "Display all in a grid" radio choice on a newly created codespeed instance.

Permalinks for Environments should use id instead of name

As the new Codespeed site shows, the environment name is used for the permalink:
http://speed.rubyspec.org/changes/?tre=10&rev=268a0200d16b42f5201ba6c9faa2cd84447a6487&exe=2&env=EC2+(OpenJDK)

Clicking on a Report row (front page) will even take you to a permalink with no url encoding but plain text:
env=EC2 (OpenJDK6)

It would be better to use an id. That would also allow to modify the name of an environment and not break permalinks.

The revision can also be changed to id, to reduce the total URL length

Allow to set benchmark properties when saving data

When trying to save data for a benchmark that doesn't exist, it automatically gets created with default values. There should be a way to set 'units', 'units_title' and 'lessisbetter' programmatically.

enhancement - remove items from legend that have no data

In the Comparisons tab, sometimes reading the chart is difficult when there are a lot of entries in the legend. It's hard to tell the difference between all the different shades of green and mapping them back to the bars.

So, I have a suggestion for improving the display.

If an executable/environment pair has no data point available, remove it from the legend and chart. This will rescale the data that is displayed so that it is larger and easier to read. Right now those "no data" spots cause the chart to rescale and make it hard to read.
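As a sketch of the suggested filtering (hypothetical names, not Codespeed's actual code), the series could be pruned before plotting:

```python
def drop_empty_series(series_by_label):
    """Remove executable/environment pairs that contributed no data points,
    so the legend shrinks and the remaining bars can rescale."""
    return {
        label: points
        for label, points in series_by_label.items()
        if any(p is not None for p in points)
    }
```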

Use History API instead of explicit permalink button

Disclaimer: I haven't checked yet whether jQuery Address also provides this feature.

Right now, the jQuery Address plugin is used to update the URL when the permalink is clicked. IMO this has some disadvantages:

  • change of URL triggers a (unnecessary) page reload
  • hashtag urls (as seen in the timeline view)

The History API allows modifying the browsing history, e.g. pushing new pages or replacing the current one. Some advantages:

  • no page reload
  • no (ugly) hashtag URLs

Using the API for codespeed we can:

  • Remove the permalink button (the URL will be updated on every single change)
  • Users can copy the URL anytime and get the same view again

Problems

Of course, browser support: http://caniuse.com/#search=history

Apparently no current version of IE supports the API. But it might be possible to fall back on the old behaviour, that is, showing a permalink button for IE users.

https://github.com/balupton/history.js looks promising: a History API for HTML5 browsers with a fallback to hashtag URLs.

Demo

https://github.com/squiddy/codespeed/tree/history_demo
Look at the comparison view and you'll (hopefully) see what I'm aiming for.

http://html5demos.com/history/

Add security measures for POSTing

Currently anyone can POST data to a Codespeed instance. Some kind of security is needed. Either proper authentication or a simpler secret key.

Required Django version incorrect

According to the readme, at least version 1.1 of Django is required to use codespeed. However, at least 2 features from 1.2 are used inside the code: syndication views and model validation.
The example project makes use of django.contrib.staticfiles, which was introduced in Django 1.3. Codespeed's templates are using STATIC_URL, so this is not going to work in older django versions right away.

Is supporting django versions < 1.3 a goal? If it is, maybe some instructions to use codespeed with these versions are needed, otherwise the readme should probably be updated.

tools/save_single_result.py fails with an unhandled KeyError

After setting up a new codespeed instance (following the directions in README.md), I tried running save_single_result.py to see what it would do to the running codespeed instance. However, it failed with this traceback (sorry, I can't figure out how to do any formatting in this text area):

exarkun@boson:/Scratch/Sources/tobami-codespeed-f2b86bc$ python tools/save_single_result.py
Traceback (most recent call last):
  File "tools/save_single_result.py", line 38, in <module>
    add(data)
  File "tools/save_single_result.py", line 31, in add
    print "Executable %s, revision %s, benchmark %s" % (data['executable_name'], data['commitid'], data['benchmark'])
KeyError: 'executable_name'
exarkun@boson:/Scratch/Sources/tobami-codespeed-f2b86bc$

ZeroDivisionError at /changes/table/

When there is a zero result added, loading the Changes tab fails:

/Users/codespeed/Scratch/Source/codespeed/speedcenter/codespeed/views.py in getchangestable
464. change = (result - c[0].value)*100/c[0].value
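A defensive rewrite of that expression might look like this (a sketch; Codespeed's actual fix may differ):

```python
def percent_change(result, base):
    """Percentage change of `result` relative to `base`, guarding against the
    zero baseline that triggers the ZeroDivisionError above."""
    if base == 0:
        return None  # let the template render "n/a" instead of crashing
    return (result - base) * 100.0 / base
```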

Add a regressions page

Essentially this would be a page that shows any benchmarks which are statistically significantly worse than the best-ever time for them on a given VM. Currently there is no good way to check for regressions besides looking at all the graphs. Thanks!

Default branch hardcoded, migration seems to have failed in that regard

Hi:

Not sure whether I did everything correct, but I think I followed the migration instructions closely.
However, I ended up with a database which is seemingly correct, but had the branch attribute (I think on revisions) set to the empty string.

Now, I changed that to 'master' since we are using git, and to avoid confusion.

But, that gives problems with at least the changes view.
See: http://soft.vub.ac.be/~ppp/codespeed/changes/

The reason is that the 'default' is hardcoded.
I am working on at least a partial fix and will probably send a pull request, but wanted to mention it because I am not sure when I will get to it.

Best regards
Stefan

Instances with many benchmarks are slow to process POST to /result/add/

In my system I have 400+ unique benchmarks. Every time a new result is POSTed, the application does a select against the codespeed_result table for every benchmark_id (over 400 select calls). It appears it is doing this to collect all results to update the codespeed_reports table.

This doesn't scale. As more and more benchmarks are added this is just going to get slower and slower.

One potential fix is to replace the 400+ selects with a single call to the DB to pull all of these results into memory at once for processing. This has some implications for overall memory footprint, but like anything it is a trade-off (space versus time).
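The proposed fix could be sketched like this (illustrative names; `fetch_all_results` stands in for the single ORM query against codespeed_result):

```python
def collect_results(fetch_all_results, benchmark_ids):
    """One round trip to the DB instead of one query per benchmark.

    fetch_all_results() returns [(benchmark_id, value), ...] for the
    revision/executable/environment the report is being built for.
    """
    by_benchmark = {}
    for bid, value in fetch_all_results():
        by_benchmark.setdefault(bid, []).append(value)
    # Only the benchmarks the report needs; missing ones map to [].
    return {bid: by_benchmark.get(bid, []) for bid in benchmark_ids}
```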

Problems with benchmark names on timeline

Not sure what the cause is, but I did not have problems with names like 'BinaryTrees (1 cores, 1 1 6)' before.
In the current version that causes a JavaScript error on the timeline: Uncaught Syntax error, unrecognized expression: [value=BinaryTrees (1 cores, 1 1 6)] jquery-1.6.2.min.js:17

See:
http://soft.vub.ac.be/~ppp/codespeed/timeline/

Hadn't had the chance to investigate more closely yet.
Will hopefully come back to it later.

Improve baseline displaying

The current hack, where a marker-less series with two points is used for the baseline, can be reimplemented the proper way: using a horizontal line instead. That is supported by jqPlot version 1.0 (canvasOverlay).

Also, jqPlot has a new feature that allows to show error messages (no data, for example): catchError.html

Require Result fields

http://speed.twistedmatrix.com/timeline/ broke because we inserted some Results with empty date fields. Speedcenter doesn't care and feeds them to jqplot, resulting in: Uncaught TypeError: Object NaN has no method 'getTime' when rendering a graph (and consequently no graph).

Fields that break jqplot when empty should either be required on insert, or checked as non-null before being provided to jqplot. The latter might be preferable since it doesn't require a database alteration.
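The non-null check could be as simple as the following sketch (hypothetical dict rows; real code would read the Result model fields):

```python
def jqplot_points(results):
    """Drop rows whose date or value is missing before handing them to jqplot,
    which otherwise fails with "Object NaN has no method 'getTime'"."""
    return [
        (r['date'], r['value'])
        for r in results
        if r.get('date') is not None and r.get('value') is not None
    ]
```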

Comparison y-axis label wrong

PyPy Speed Center comparison chart shows the performance of different versions of PyPy. The Y-axis says: "Ratio (less is better)", but the vertical bars are taller for more recent versions of PyPy, and PyPy is getting faster. Either the bars are labeled wrong or more is better.

http://speed.pypy.org/comparison/?exe=1%2B41%2C1%2B172%2C1%2BL&ben=1%2C27%2C2%2C25%2C3%2C4%2C5%2C22%2C6%2C7%2C8%2C23%2C24%2C9%2C10%2C11%2C12%2C13%2C14%2C15%2C16%2C17%2C18%2C19%2C20%2C26&env=1&hor=false&bas=2%2B35&chart=normal+bars

Don't show revisions that don't have any data for a particular exe

In the Changes view, the revision list (combo box) updates based on which project is selected. So executables inside the same project share the same revision list. That can lead to having a selected revision, selecting another executable, and getting an empty changes table.

Users shouldn't have to guess which rev/exe combination has data and which not.

GitHub backend, commits not ordered with most recent on top, and commits from same push omitted

Hi:

The current GitHub backend does not work as well as the Git backend.
The two major problems from my point of view are that the shown commits are ordered least-recent first, which is different from the Git backend, where the most recent is shown first.

Furthermore, it looks like commits that were pushed to GitHub in the same step are not all shown.
Only the most recent of those commits is shown, and the next shown commit seems to be from another push. (That might correlate with dates, too, not necessarily pushes.)

Best regards
Stefan

Timeline plots fail when more than 4 plot series are selected

(Reported on the mailing list by Steffen Schwigon)

The timeline view checks whether there are more than 4 data series, and in that case it moves the legend outside the plot area to unclutter it. Unfortunately, when implementing branches, the plotoptions modification got moved before the actual variable declaration:

it just needs to be moved below the var plotoptions line

on Changes tab, git revision information overlaps stddev/change/trend columns

0.8 prerelease, GitHub master

I took a screenshot but github doesn't allow me to attach it to this issue. I can email it directly to you if you wish.

I believe this issue is due to my use of benchmark names that are longer than 30 chars (the limit set in model.py). SQLite doesn't care if you go over the defined limit, but apparently the Django view expects the names to be shorter.

Many of my benchmark names are 60-80 chars long (they are descriptive rather than just a simple name like "fixnum").

It would be nice if the page reflowed based upon the benchmark name. To keep things sane, it should probably wrap the name if it exceeds a certain number of chars (e.g. 80).

In the interim, I would be grateful if someone could tell me which template to modify to make it wider.

Allow option to display axis in log scale for time and ratio comparison plots

Sam Mason reported that "when you're showing something that's
taking several seconds and one that's taking less than a second the
smaller one just gets pushed down to zero and all the detail is lost."

It had been considered but discarded as a default due to most people's inability to correctly interpret log scale. It would be useful as an option, however.

After creating an environment and a default project, /comparison/ fails with a 500 error

To reproduce, create a new codespeed server. Create an environment and a project. Then visit the comparison page. Unlike the other two pages in this configuration (which report a message about requiring at least one executable), the comparison page raises an unhandled exception resulting in a 500.

Exception Type: DoesNotExist
Exception Value: Revision matching query does not exist.

The last codespeed line in the traceback is:

/home/exarkun/Scratch/Sources/tobami-codespeed-f2b86bc/speedcenter/codespeed/views.py in getcomparisonexes
82. rev = Revision.objects.filter(project=proj).latest('date')

try to separate JS from Templates

Just want to check what your stance on this is. Right now, there is a lot of JavaScript code in the templates. I'd favor separating them more, moving (most of) the code into files. I think that can ease maintenance of the code, decrease page size, and let clients cache (possibly minified and bundled) JavaScript files. Of course simply moving it won't do it; a lot of code requires setting up data and configuration, but it should be possible to clean this up. What do you think?
