
pyphot's Introduction

pyphot -- A tool for computing photometry from spectra


This is a set of tools to compute synthetic photometry in a simple way, ideal to integrate in larger projects.

full documentation at: http://mfouesneau.github.io/pyphot/

The inputs are photonic or energetic response functions for the desired photometric bands, together with stellar spectra. The modules handle units in the wavelength definition through a simplified version of pint as well as through astropy.units.

Filters are represented individually by a Filter object. Collections of filters are handled with a Library. We provide an internal library that contains a significant number of common filters.

Each filter is minimally defined by a wavelength and a throughput. Many properties, such as the central or pivot wavelength, are computed internally. When units are provided for the wavelength, zero points in multiple units are also accessible (AB, Vega magnitude, Jy, erg/s/cm2/AA). The default detector type is assumed to be photonic, but energetic detectors are also handled in the computations.
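
A minimal quick-start sketch (following the pattern from the documentation; the spectrum below is a placeholder, and the unit['AA'] convention of the bundled ezunits is assumed as in the docs):

import numpy as np
import pyphot
from pyphot import unit

# load the internal filter library and pick a passband
lib = pyphot.get_library()
f = lib['SDSS_r']

# placeholder spectrum: wavelength in Angstrom, flux in erg/s/cm2/AA
lamb = np.linspace(3000., 10000., 2000) * unit['AA']
flux = np.full(2000, 1e-14) * unit['erg/s/cm**2/AA']

# integrate the spectrum through the passband and convert to an AB magnitude
fphot = f.get_flux(lamb, flux)
mag = -2.5 * np.log10(fphot.magnitude) - f.AB_zero_mag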

Note

cphot is a spin-off project that provides a C++ version of pyphot

What's new?

  • [November 22, 2021] new filters, SVO interface, automated tests and documentation.
  • [November 6, 2019] astropy version available in beta (from pyphot import astropy as pyphot)
  • [April 29, 2019] sandbox contains fully unit aware passbands and lick indices libraries
  • [April 15, 2019] merged UncertainFilter to main, sandbox contains passbands accounting for spectrum units
  • [March 4, 2019] added flux calculations in photon/s/cm2
  • [March 4, 2019] added many properties per filter (lphot, lmin, lmax)
  • [June 12, 2018] adding Sun reference spectra (see the Sun class)
  • [Apr. 26, 2018] includes Gaia nominal, DR2 and revised DR2 passbands

Installation

Before installation, make sure you have HDF5 version 1.8.4 or above (this is required for pytables, see details at: https://github.com/PyTables/PyTables).

For OSX:

brew install hdf5

For Debian-based distributions:

sudo apt-get install libhdf5-serial-dev

  • Installation from PyPI

pip install pyphot

  • Manual installation: download the repository and run the setup

python setup.py install

Contributors

Author:

Morgan Fouesneau

Direct contributions to the code base:

  • Tim Morton (@timothydmorton)
  • Ariane Lancon (@lancon)

pyphot's People

Contributors

jvines, klukosiute, lancon, mfouesneau, timothydmorton, yqiuu


pyphot's Issues

ezunits equivalency for the Kelvin

The ezunits package defines the following equivalency (line 42 of default_en.txt):
degK = [temperature] = K = kelvin
degK is not a valid symbol for the absolute temperature scale, and this usage should be discouraged.

TypeError: only integer scalar arrays can be converted to a scalar index

Hi,

I was trying to use pyphot to do synthetic photometry for a star for which I have a calibrated spectrum.
The wavelength values are stored in the array lamb and the calibrated flux values in the array spectra. I was following the quick-start guide in the documentation, and the code gives me the error: TypeError: only integer scalar arrays can be converted to a scalar index

import numpy as np
import pyphot

lib = pyphot.get_library()
f = lib['SDSS_r']

fluxes = f.get_flux(lamb, spectra, axis=-1)
Warning: assuming units are consistent
--------------------------------------------------------------------------
TypeError                                Traceback (most recent call last)
<ipython-input-44-bc534503aad0> in <module>()
----> 1 fluxes = f.get_flux(lamb, spectra, axis=-1)

/home/user/.local/lib/python2.7/site-packages/pyphot/phot.pyc in get_flux(self, slamb, sflux, axis)
    248             Energy of the spectrum within the filter
    249         """
--> 250         return self.getFlux(slamb, sflux, axis=axis)
    251 
    252     def reinterp(self, lamb):

/home/user/.local/lib/python2.7/site-packages/pyphot/phot.pyc in getFlux(self, slamb, sflux, axis)
    214                 _sflux = sflux[:, ind]
    215             except:
--> 216                 _sflux = sflux[ind]
    217             # limit integrals to where necessary
    218             ind = ifT > 0.

TypeError: only integer scalar arrays can be converted to a scalar index

When I searched online, I saw that the issue appears to be related to the latest version of NumPy.
The version of NumPy that I am using is 1.14.5.

Can anyone suggest a fix, please?
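
A hedged workaround (not an official fix): the failing line indexes sflux with an index array, which a plain Python list cannot handle, so casting both inputs to NumPy float arrays before calling get_flux may sidestep the error. A sketch reusing the lamb, spectra and f from the snippet above:

import numpy as np

# cast the spectrum inputs to ndarrays; indexing a Python list with an
# integer index array raises exactly this TypeError
lamb = np.asarray(lamb, dtype=float)
spectra = np.asarray(spectra, dtype=float)

fluxes = f.get_flux(lamb, spectra, axis=-1)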

filter cleaning too harsh

In the filter get_flux (phot.py l.180), the trimming of values removes all zero-throughput points, which sometimes drops the last pixels of the band.

Instantiating empty library, adding filters and writing to HDF5 produces strange results

Hi!

We (myself and @sundarjhu) are seeing some strange behaviour when trying to create new libraries. We are instantiating an empty UnitHDF_Library, adding a number of UnitFilters to it using add_filter(), and trying to write it out. When attempting to read it back in with get_library_content() or load_all_filters(), we get the following exceptions:

Traceback (most recent call last):
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/pyphot/sandbox.py", line 1456, in get_library_content
    filters = s.hdf.root.content.cols.TABLENAME[:]
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 836, in __getattr__
    return self._f_get_child(name)
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 708, in _f_get_child
    self._g_check_has_child(childname)
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 393, in _g_check_has_child
    raise NoSuchNodeError(
tables.exceptions.NoSuchNodeError: group ``/`` does not have a child named ``content``

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/pyphot/sandbox.py", line 1458, in get_library_content
    filters = list(s.hdf.root.filters._v_children.keys())
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 836, in __getattr__
    return self._f_get_child(name)
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 708, in _f_get_child
    self._g_check_has_child(childname)
  File "/scratch/home/psciclun/anaconda3/envs/sedutils/lib/python3.8/site-packages/tables/group.py", line 393, in _g_check_has_child
    raise NoSuchNodeError(
tables.exceptions.NoSuchNodeError: group ``/`` does not have a child named ``filters``

Opening the resulting library file with h5py to examine the contents shows that the file only appears to contain the root group: no filters group has been created (nor does it seem to contain any datasets).

This is in a new conda environment with pyphot v1.1. Are we missing some step?

My own set of filters

Hi,

How can I include my own set of filters in the pyphot library? I have the transmission profiles. These are for CCD detectors.

Thanks
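
For reference, the Filter constructor pattern quoted verbatim in a later issue ("tophat = Filter(wave, transmit, ...)") is the documented way to turn a transmission profile into a passband; a sketch with placeholder arrays:

import numpy as np
from pyphot import Filter

# placeholder transmission profile: wavelength in Angstrom, throughput in [0, 1]
wave = np.linspace(4000., 6000., 500)
transmit = np.exp(-0.5 * ((wave - 5000.) / 300.) ** 2)

# dtype='photon' matches photon-counting (CCD) detectors
my_filter = Filter(wave, transmit, name='my_ccd_band',
                   dtype='photon', unit='Angstrom')
my_filter.info()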

References for response curves

The origin of the filter/response curves would be really useful from a traceability/reproducibility standpoint (e.g. references). Is this information available somewhere? I looked in the docs but did not find it easily.

import error

Hi,
I just installed the package with pip, and when I try to import it I get the following error. Any idea how to fix this?

In [2]: import pyphot

ImportError                               Traceback (most recent call last)
<ipython-input-2-...> in <module>()
----> 1 import pyphot

/usr/local/lib/python2.7/site-packages/pyphot/__init__.py in <module>()
----> 1 from .phot import (Library, Ascii_Library, HDF_Library, Filter, get_library)
      2 from .licks import (LickLibrary, LickIndex, reduce_resolution)
      3 from .ezunits import unit
      4 from .helpers import (STmag_from_flux, STmag_to_flux, extractPhotometry,
      5                       extractSEDs, fluxErrTomag, fluxToMag, magErrToFlux,

/usr/local/lib/python2.7/site-packages/pyphot/phot.py in <module>()
     17 from __future__ import print_function, division
     18 import numpy as np
---> 19 import tables
     20 from scipy.integrate import trapz
     21

ImportError: No module named tables

Kepler and/or TESS filters?

[Enhancement and question!]
Would it be possible to add Kepler or TESS filters to the filter library? And on a slightly different note, what information do I need to add a custom filter myself and save it in the HDF file for later re-use?

Photometry on a spectra according to the transmission curve of a certain passband

Hello!

I was wondering whether pyphot could do a job like this:
I have finished an SED fit of a galaxy and obtained the best-fit model spectrum.
I want to infer the rest-frame UVJ magnitudes of the galaxy from that spectrum.
Before doing it myself, I wanted to check whether there is already well-developed code for this task.

Could you please give me some tips?

Anything from you will be appreciated! Thanks in advance!

: )
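
This is exactly the kind of task pyphot's get_flux is meant for; a sketch, assuming the best-fit spectrum has already been shifted to the rest frame (the spectrum arrays and the band names below are placeholders; use lib.find() to locate the exact library entries):

import numpy as np
import pyphot

lib = pyphot.get_library()

# placeholder rest-frame model spectrum (Angstrom, erg/s/cm2/AA): use your best fit here
model_wave = np.linspace(1000., 30000., 5000)
model_flux = np.full_like(model_wave, 1e-16)

# placeholder band names: check lib.find('johnson'), lib.find('2mass'), ... for exact entries
for band in ('GROUND_JOHNSON_U', 'GROUND_JOHNSON_V', '2MASS_J'):
    f = lib[band]
    flux = f.get_flux(model_wave, model_flux)        # prints a unit-consistency warning
    mag_ab = -2.5 * np.log10(flux) - f.AB_zero_mag   # zero-point conversion from the docs
    print(band, mag_ab)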

issue with filter profiles?

Hi,

Can I ask how you defined the filters in pyphot? I noticed that at least some of them do not match archival values.

I made a plot of a 20,000 K blackbody and then plotted the transmission curves of your Ground_Johnson_V and HST_WFPC2_F555W together with the corresponding filters from the SVO Filter Profile Service, and they do not match. Most importantly, f.transmit() for the pyphot Ground Johnson V filter shows that the filter has a bandwidth of over 3000 Angstrom and that it peaks around 3000 Angstrom.

I am attaching a copy of that plot:

[attached plot: figure_1]

extractPhotometry method expects 2d array

Even though extractPhotometry is supposed to compute synthetic photometry for a single spectrum (the docstring specifies that spec is a 1d array), the code expects a 2d array.

For example,

import pyphot
vega = pyphot.vega.Vega()
fL = pyphot.UnitHDF_Library().load_all_filters(lamb = vega.wavelength)
pyphot.extractPhotometry(vega.wavelength, vega.flux, fL)

results in the following message:

Photometry:|------------------------------------------------------------------------------------------| 0/271  0% [time: 00:00, eta: ?, ? iters/sec]Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/site-packages/pyphot/helpers.py", line 62, in extractPhotometry
    s0  = spec[:, xl]
  File "/usr/local/lib/python3.9/site-packages/pyphot/ezunits/pint.py", line 1143, in __getitem__
    value = self._magnitude[key]
IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed

Line 62 in helpers.py could be changed from
s0 = spec[:, xl]
to either
s0 = spec[xl]
or
s0 = np.atleast_2d(spec)[:, xl]

@pscicluna

leff and lphot do not convert wavelength units

Steps to reproduce

  1. Get a transmission curve in nanometers
  2. Pass it to the constructor of Filter
  3. Get that new filter's leff and lphot

Expected behaviour:
Like the other derived quantities, lphot and leff are also in nanometers and fall within the filter's band.

Observed behaviour:
The leff and lphot are a factor of 10 too large; in other words, the value is in Ångström but the Quantity has units of nanometer.

Minimal example

(using filters from the library as an example, although of course the point is that this can happen when creating new filters)

import pyphot

# would work with any filters, these are available for the example
lib = pyphot.get_library()

for filter_name in ['ZTF_g', # in nanometers
                    'SDSS_g', # in Angstrom
                   ]:
    print(filter_name)
    
    # In the HDF5 file, lphot and leff are correct
    original = lib[filter_name]
    
    # Now construct a new filter based on this
    reloaded = pyphot.phot.Filter(
        # all arguments the constructor accepts:
        original.wavelength,
        original.transmit,
        name = 'test_Filter_constructor',
        dtype='photon',
        unit=original.wavelength_unit,
    )
    print(f'lphot {original.lphot:.2f} turns to {reloaded.lphot:.2f}')
    print(f'leff {original.leff:.2f} turns to {reloaded.leff:.2f}')

Error when calling load_all_filters

The following code results in an AttributeError:

from pyphot import sandbox as pyphot
a = pyphot.get_Library()
f = a.load_all_filters()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/site-packages/pyphot/sandbox.py", line 1480, in load_all_filters
    filters = [s._load_filter(fname, interp=interp, lamb=lamb)
  File "/usr/local/lib/python3.9/site-packages/pyphot/sandbox.py", line 1480, in <listcomp>
    filters = [s._load_filter(fname, interp=interp, lamb=lamb)
  File "/usr/local/lib/python3.9/site-packages/pyphot/sandbox.py", line 1433, in _load_filter
    fnode = ftab.get_node('/filters/' + fname)
AttributeError: 'NoneType' object has no attribute 'get_node'

The reason seems to be that the hdf attribute is set to None when instantiating UnitHDF_Library:

type(a.hdf)
<class 'NoneType'>

As a workaround, I've used the following successfully:

f = a.load_all_filters(a.get_library_content())
f = a.load_all_filters(a.content())

I'm probably missing something when instantiating the UnitHDF_Library, please let me know. Thanks!

Astropy units?

[an enhancement request]
Are there plans to move to astropy units in the future?
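
As noted in the changelog above, a beta astropy interface already exists (from pyphot import astropy as pyphot); a sketch of switching to it, assuming that submodule mirrors the main API:

import numpy as np
import astropy.units as u
from pyphot import astropy as pyphot   # beta interface mentioned in the changelog

lib = pyphot.get_library()
f = lib['SDSS_r']

# placeholder spectrum carrying astropy units
lamb = np.linspace(3000., 10000., 2000) * u.AA
flux = np.full(2000, 1e-14) * u.erg / u.s / u.cm**2 / u.AA

fluxes = f.get_flux(lamb, flux)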

Reference/Use of filter library

Hello! I am planning on writing a filter library similar to this one in Julia. I would like to use your new_filters.hd5 library in my package as well, and the MIT license seems permissive enough to allow this. If you'd like, though, perhaps you could upload the data to Zenodo, which would (1) provide consistent hosting and a CDN and (2) attach a DOI allowing direct citation of the data. If you end up using Zenodo, I will add an appropriate citation for the data.

Incorrect transmission curve?

Hi,

I was working with pyphot and I found that when plotting the transmission curve for GROUND_JOHNSON_B I get the following:

[attached plot of the GROUND_JOHNSON_B transmission curve]

which is obviously incorrect.

Cheers,

Jose V.

TypeError when querying SVO filter service

Running svo.get_pyphot_astropy_filter("GAIA/GAIA3.Gbp") gives the following error:

File "\Anaconda3\lib\site-packages\pyphot\svo.py", line 55, in get_pyphot_astropy_filter
    name=params['filterID'].replace('/', '_'),

TypeError: a bytes-like object is required, not 'str'
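
A hedged guess at a minimal fix (not a confirmed patch): under some astropy versions the VOTable parameter comes back as bytes, so decoding it before replace() would avoid this TypeError. A sketch of the relevant lines in svo.py:

# inside get_pyphot_astropy_filter: decode bytes before building the filter name
filter_id = params['filterID']
if isinstance(filter_id, bytes):
    filter_id = filter_id.decode()
name = filter_id.replace('/', '_')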

Broken github workflow

For some reason, the GitHub workflow no longer works. The documentation points to an outdated branch, and PyPI does not work either.

When not using the unit-aware classes, loading the included dat files with Filter.from_ascii can result in different zero points than the HDF5 file

I'm sorry the title is a bit vague, I was trying to cover two possibilities because I'm not sure about the expected behaviour. The docs say

the transmission is unitless. Its scaling factor will not affect the flux/magnitude calculations as these implicitly normalize the passband (see the equations), but it will affect the number of photons/s/cm2

So if I understand that right, the issue is either

  • "The ZTF transmission curve throughputs are in percent instead of in [0, 1]" or
  • "from_ascii makes /(AB|ST|Vega)_zero_Jy/ inversely proportional to the transmission curve normalization"
    Interestingly, the one quantity in photons (mentioned above) which is meant to change only changes by a rounding error, along with the other Vega system zero points.

I understand this doesn't have high priority because _zero_flux and _zero_mag are the ones used for flux/magnitude calculations. But maybe it's still helpful to report.

Steps to reproduce:

  1. Load all the filters in libs/ascii_sources with Ascii_Library
  2. Use get_library to load libs/new_filters.hdf5
  3. Compare two ZTF filters from either.

Expected behaviour:
Their zero points (etc.) are identical within rounding error.

Observed behaviour:
AB_zero_Jy, ST_zero_Jy, Vega_zero_Jy are reduced by a factor 100

Minimal example

import pyphot
import os
import numpy as np

# Using provided library for comparison
lib = pyphot.get_library()
# Reloading and storing in dictionary
lib_reload = {}
pyphot_dir = os.path.dirname(pyphot.__file__)
pyphot_filter_dir = os.path.join(pyphot_dir, 'libs/ascii_sources/')
for _f in pyphot.phot.Ascii_Library(pyphot_filter_dir).load_all_filters():
    lib_reload[_f.name] = _f

# e.g. ZTF filter dat files are in percent
for filter_name in ['ZTF_g', 'ZTF_r', 'ZTF_i']:
    print(f'Filter: {filter_name}')
    
    filters = [_lib[filter_name] for _lib in [lib, lib_reload]]
    
    zeropoint_keys = []
    for _q in ['AB', 'ST', 'Vega']:
        for _u in ['mag', 'Jy', 'flux']:
            _k = f'{_q}_zero_{_u}'
            zeropoint_keys.append(_k)
    zeropoint_keys.append('Vega_zero_photons')

    for _k in zeropoint_keys:
        # get the zero point from both the original and reloaded filter
        quantities = [_f.__getattribute__(_k) for _f in filters]
        
        # accommodate both those with units and the unitless i.e. magnitudes
        # to help with printing
        if isinstance(quantities[0], pyphot.unit.Quantity):
            values = tuple([_q.magnitude for _q in quantities])
            unit = quantities[0].unit
        else:
            values = tuple(quantities)
            unit = ''

        # NumPy uses tolerances approximately to the effect of 
        # "numbers are close both in relative and absolute difference"
        match = np.isclose(*values, rtol=1e-10, atol=1e-10)
        
        print(f'{_k} is',
              [f'different: \n{values} {unit}', 'identical'][match],
             )

Link to SVO website

It would be nice if one could query filters from the SVO website (though some of them are incorrectly defined, so caution is warranted).
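
For reference, the SVO interface mentioned in the changelog is already exercised in a later issue (svo.get_pyphot_astropy_filter); a sketch of querying a single passband that way (the info() call at the end assumes the returned object exposes the usual Filter API):

from pyphot import svo

# fetch a single passband from the SVO Filter Profile Service;
# the identifier follows SVO's "facility/instrument.band" convention
f = svo.get_pyphot_astropy_filter("GAIA/GAIA3.Gbp")
f.info()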

ZTF filters are broken

It seems ZTF filters are currently broken.

For ZTF_g, the problem is a missing TAB between 'WAVELENGTH_UNIT' and 'nanometer' in the ASCII file, which prevents correct reading of the wavelength unit.

For ZTF_i, the entries in the ASCII file are in descending order, which breaks the numerical integration. It may be fixed either at the file level or, more generically, in the Filter class constructor like this:

diff --git a/pyphot/phot.py b/pyphot/phot.py
index 8cfb2f0..120ca21 100644
--- a/pyphot/phot.py
+++ b/pyphot/phot.py
@@ -129,6 +129,11 @@ class Filter(object):
             self._wavelength = wavelength
         self.set_wavelength_unit(unit)
         self.transmit   = np.clip(transmit, 0., np.nanmax(transmit))
+
+        aidx = np.argsort(self._wavelength)
+        self._wavelength = self._wavelength[aidx]
+        self.transmit = self.transmit[aidx]
+
         self.norm       = trapz(self.transmit, self._wavelength)
         self._lT        = trapz(self._wavelength * self.transmit, self._wavelength)
         self._lpivot    = self._calculate_lpivot()

Both of these errors propagate into the default new_filters.hd5 library.

ZTF_r loads correctly, but its effective wavelength is an order of magnitude larger than it should be (which seems to be a generic problem for all filters defined with 'nanometer' units; at least Gaia_MAW_G shows the same behaviour):

Filter object information:
    name:                 ZTF_r
    detector type:        photon
    wavelength units:     nanometer
    central wavelength:   643.692086 nanometer
    pivot wavelength:     642.116699 nanometer
    effective wavelength: 6339.593128 nanometer
    photon wavelength:    6370.971519 nanometer
    minimum wavelength:   560.080000 nanometer
    maximum wavelength:   731.660000 nanometer
    norm:                 14739.212505
    effective width:      151.525747 nanometer
    fullwidth half-max:   155.710000 nanometer
    definition contains 3201 points

Publication of the code to provide a Reference for other papers

Hi pyphot developers:

I am one of the data editors for the AAS Journals. We are seeing pyphot mentioned in some of our submissions. Do you have a preferred citation for this package that we can recommend authors use in their reference lists?

thanks,

  • gus muench

With regard adding new filters

Hi,

I recently added some filters to pyphot, but calling the info method raises an AttributeError stating that it needs wavelength units. I fear I may have missed something while adding the filters. How can I go about fixing this?

Thanks in advance.

pyfits is not included in module dependencies

After installing pyphot via pip, I tried to import the pyphot module in IPython and to run the following example script copied from the documentation:

import pyphot
# get the internal default library of passbands filters
lib = pyphot.get_library()
print("Library contains: ", len(lib), " filters")
# find all filter names that relates to IRAC
# and print some info
f = lib.find('irac')
for name in f:
    lib[name].info(show_zeropoints=True)

However, I ended up with a rather enigmatic traceback:

[gszasz@nova scratch]$ ./pyphot-test.py 
Traceback (most recent call last):
  File "./pyphot-test.py", line 4, in <module>
    import pyphot
ImportError: No module named 'pyphot

When I tried to import the pyphot module in IPython, I finally got the real reason why it does not work:

[gszasz@nova ~]$ ipython
Python 2.7.5 (default, Apr 10 2015, 08:09:05) 
Type "copyright", "credits" or "license" for more information.

IPython 0.13.2 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

In [1]: import pyphot
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-748f71ce1480> in <module>()
----> 1 import pyphot

/usr/lib/python2.7/site-packages/pyphot/__init__.py in <module>()
----> 1 from .phot import (Library, Ascii_Library, HDF_Library, Filter, get_library)
      2 from .licks import (LickLibrary, LickIndex, reduce_resolution)
      3 from .ezunits import unit
      4 from .helpers import (STmag_from_flux, STmag_to_flux, extractPhotometry,
      5                       extractSEDs, fluxErrTomag, fluxToMag, magErrToFlux,

/usr/lib/python2.7/site-packages/pyphot/phot.py in <module>()
     20 from scipy.integrate import trapz
     21 
---> 22 from .simpletable import SimpleTable
     23 from .ezunits import hasUnit, unit
     24 from .vega import Vega

/usr/lib/python2.7/site-packages/pyphot/simpletable.py in <module>()
     52     from astropy.io import fits as pyfits
     53 except ImportError:
---> 54     import pyfits
     55 except:
     56     pyfits = None

ImportError: No module named pyfits

Effective wavelength

Is there a method to calculate effective wavelengths by convolving with an observed spectrum rather than the archived Vega spectrum?
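
No dedicated method is mentioned here, but the effective wavelength is just a throughput-weighted mean, so it can be computed by hand from a filter's wavelength and transmit arrays; a sketch (the photon-counting weighting convention below is an assumption to check against pyphot's own definition of leff/lphot):

import numpy as np

def effective_wavelength(filt, slamb, sflux, photon=True):
    """Effective wavelength of `filt` weighted by an observed spectrum (slamb, sflux).

    slamb must be in the same wavelength unit as the filter. With photon=True
    an extra factor of lambda enters the weight (photon-counting convention).
    """
    wave = getattr(filt.wavelength, 'magnitude', filt.wavelength)   # strip units if present
    T = np.interp(slamb, wave, filt.transmit, left=0.0, right=0.0)  # resample the passband
    w = T * sflux * (slamb if photon else 1.0)
    return np.trapz(slamb * w, slamb) / np.trapz(w, slamb)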

[Question] How to calculate a luminosity (in erg/s) from a given apparent magnitude in a filter?

Hello, can I ask a simple question here?
If I have a source with a magnitude in a given filter (such as the SDSS g band), I can get the zero point of that filter using pyphot and convert the magnitude to a flux density F_lambda in erg/cm^2/A/s. Then I can calculate a luminosity using L_lambda = 4*pi*D^2*F_lambda, whose unit is now erg/A/s. But how can I turn it into erg/s? By multiplying by the integral of the filter transmission curve? Different photometric systems have different definitions of flux density, and I am confused.
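
One hedged way to get there: use the zero point to convert the magnitude to F_lambda, approximate the in-band flux by multiplying by the band's effective width, then scale by 4*pi*D^2. Both the approximation and the width attribute name below are assumptions to check against the documentation:

import numpy as np
import pyphot

lib = pyphot.get_library()
f = lib['SDSS_g']

mag_ab = 18.0        # example apparent AB magnitude in the band
D_cm = 3.086e24      # example luminosity distance (1 Mpc in cm)

# magnitude -> monochromatic flux density [erg/s/cm2/AA] via the AB zero point
F_lambda = f.AB_zero_flux.magnitude * 10 ** (-0.4 * mag_ab)

# approximate the band-integrated flux with the effective width of the passband;
# `width` (effective width, here in Angstrom) is an assumed attribute name
F_band = F_lambda * f.width.magnitude            # erg/s/cm2

L_band = 4.0 * np.pi * D_cm ** 2 * F_band        # erg/s
print(L_band)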

f.get_flux(lamb, spectra)

Good afternoon!
I have some questions about the f.get_flux command. Does this command interpolate the filter onto the object's spectrum grid, or does it just calculate the integrated flux through the filter f?

And another question: I would like to include Subaru's Hyper Suprime-Cam filters in pyphot's filter library. Is there a way to do this without using the line "tophat = Filter(wave, transmit, name='tophat', dtype='photon', unit='Angstrom')"?
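
On the first question: an earlier traceback in this list shows that Filter objects expose a reinterp(lamb) method, so the passband can be resampled onto the spectrum grid explicitly before integrating; a sketch with a placeholder spectrum (whether get_flux resamples internally is not confirmed here):

import numpy as np
import pyphot

lib = pyphot.get_library()
f = lib['SDSS_r']

# placeholder spectrum
lamb = np.linspace(3000., 10000., 4000)
spectra = np.full_like(lamb, 1e-15)

# resample the passband onto the spectrum grid, then integrate
f_on_grid = f.reinterp(lamb)
fluxes = f_on_grid.get_flux(lamb, spectra)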

References of the Lick indices

I have been trying to relate the calculated Lick indices to their references, but I am not finding them anywhere directly. Specifically, what is the 'LB13' reference for the indices "Ca1_LB13", "Ca2_LB13" and "Ca3_LB13"? Maybe adding a "Ref" column to the band table could document the reference for each index.

Make code pip installable?

Hi, I'm really enjoying using your code; it makes my life a lot easier. It would be great for users if it were packaged and uploaded to PyPI, so that it could be installed automatically via pip install pyphot and listed in a requirements file.

The repository looks well-organized and like it's basically ready to go and wouldn't require that much work to make this happen. Would you mind if I tried going ahead and packaging and uploading to test.pypi.org to see if it would work? If it works, then you can upload to the real index?

minor updates for consistency

Function at line 73:
sigma0 is a FWHM / resolution. Maybe rename the variable to "res0" or "fwhm0" and document it explicitly (line 113 converts it to standard-deviation units).

Line 328: indcont is a subset of w. The < and > comparisons need to be replaced by <= and >=. Check for truncation of values.
