
alexnet-experiments-keras's People

Contributors

ankitaggarwal011, duggalrahul


alexnet-experiments-keras's Issues

choice of mean values in mean_subtract() function

How were the mean values for BGR decided? In the mean_subtract() function, how were the values 123.68, 116.779, and 103.939 chosen?
def mean_subtract(img):
    img = T.set_subtensor(img[:,0,:,:], img[:,0,:,:] - 123.68)
    img = T.set_subtensor(img[:,1,:,:], img[:,1,:,:] - 116.779)
    img = T.set_subtensor(img[:,2,:,:], img[:,2,:,:] - 103.939)

    return img / 255.0

And how do I customize it?
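These three constants are the per-channel ImageNet mean pixel values (123.68, 116.779, 103.939), popularized by the VGG preprocessing code; they are not specific to this repository. To customize them, compute your own dataset's per-channel means. A minimal numpy sketch (the `images` array is a hypothetical stand-in for your training set, channels-first as in the Theano dim ordering used here):

```python
import numpy as np

# Hypothetical stand-in for your training set: N images, channels-first.
images = np.random.randint(0, 256, size=(10, 3, 227, 227)).astype("float32")

# Per-channel means over all images and spatial positions.
channel_means = images.mean(axis=(0, 2, 3))  # shape (3,)

def mean_subtract(img, means=channel_means):
    """Subtract per-channel means from a (batch, 3, H, W) array."""
    return (img - means[None, :, None, None]) / 255.0

batch = mean_subtract(images[:2])
print(batch.shape)  # (2, 3, 227, 227)
```

In the repo's Theano version, you would substitute your computed means for the three hard-coded constants inside mean_subtract().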

Generation of confusion matrix and roc curve

Many thanks for your post. I'm following your AlexNet code as a feature extractor for my binary medical image classification, using the TensorFlow backend. Could you please share code to compute the confusion matrix and ROC AUC scores? It would be extremely helpful.
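A common recipe, assuming scikit-learn is available (it is not part of this repo, so `pip install scikit-learn` may be needed). The labels and probabilities below are hypothetical placeholders for your model's outputs (e.g. from model.predict):

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score, roc_curve

# Hypothetical outputs: ground-truth labels and the model's
# positive-class probabilities for a binary problem.
y_true = np.array([0, 0, 1, 1])
y_prob = np.array([0.1, 0.4, 0.35, 0.8])
y_pred = (y_prob >= 0.5).astype(int)   # hard labels at a 0.5 threshold

cm = confusion_matrix(y_true, y_pred)  # rows: true class, cols: predicted
auc = roc_auc_score(y_true, y_prob)    # threshold-free ranking metric
fpr, tpr, thresholds = roc_curve(y_true, y_prob)  # points for an ROC plot

print(cm)
print(auc)  # 0.75
```

Note that roc_auc_score needs the raw probabilities, not the thresholded predictions; passing y_pred instead of y_prob is a common mistake that silently degrades the score.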

importing errors?

At the very first cell of AlexNet_Experiments.ipynb, I ran cell [1] and got the following errors.
Please help me.
---In [1]---
from keras.preprocessing.image import ImageDataGenerator
from keras.optimizers import SGD

from alexnet_base import *
from utils import *

---error message---
Using Theano backend.
ImportError Traceback (most recent call last)
in ()
2 from keras.optimizers import SGD
3
----> 4 from alexnet_base import *
5 from utils import *

~\DeepLeanviaKreasTheano\Code\alexnet_base.py in ()
11 Input, merge, Lambda
12 from keras.layers.convolutional import Convolution2D, MaxPooling2D, ZeroPadding2D
---> 13 from convnetskeras.customlayers import convolution2Dgroup, crosschannelnormalization,
14 splittensor, Softmax4D
15

~\DeepLeanviaKreasTheano\convnets-keras\convnetskeras\customlayers.py in ()
1 import numpy as np
----> 2 from keras.layers.core import Lambda, Merge
3 from keras.layers.convolutional import Convolution2D
4 from keras import backend as K
5

ImportError: cannot import name 'Merge'
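The Merge layer was removed in Keras 2, while the convnets-keras custom layers were written against the Keras 1.x API, which is why the import fails. One hedged workaround is to pin a Keras 1.x release in a requirements file (the exact versions below are a guess based on the Keras 1.2 / Theano 0.9 combination reported working elsewhere in this thread; adjust as needed):

```text
# requirements.txt sketch -- untested version pins, adjust as needed
Keras==1.2.2
Theano==0.9.0
```

The alternative is to port convnetskeras/customlayers.py to the Keras 2 API, which is a larger change than pinning versions.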

ValueError: output of generator should be a tuple (x, y, sample_weight) or (x, y). Found: None

Hello,

I am using Python 2.7, Theano 0.9 and Keras 1.2, but I am getting the following errors for line 5:

"Exception in thread Thread-32:
Traceback (most recent call last):
File "C:\Users\Smir\Anaconda2\lib\threading.py", line 801, in __bootstrap_inner
self.run()
File "C:\Users\Smir\Anaconda2\lib\threading.py", line 754, in run
self.__target(*self.__args, **self.__kwargs)
File "C:\Users\Smir\Anaconda2\lib\site-packages\keras\engine\training.py", line 429, in data_generator_task
generator_output = next(self._generator)
File "C:\Users\Smir\Anaconda2\lib\site-packages\keras\preprocessing\image.py", line 832, in next
target_size=self.target_size)
File "C:\Users\Smir\Anaconda2\lib\site-packages\keras\preprocessing\image.py", line 294, in load_img
raise ImportError('Could not import PIL.Image. '
ImportError: Could not import PIL.Image. The use of array_to_img requires PIL.


ValueError Traceback (most recent call last)
in ()
9 nb_val_samples=800,
10 nb_epoch=80,
---> 11 verbose=1)

C:\Users\Smir\Anaconda2\lib\site-packages\keras\engine\training.pyc in fit_generator(self, generator, samples_per_epoch, nb_epoch, verbose, callbacks, validation_data, nb_val_samples, class_weight, max_q_size, nb_worker, pickle_safe, initial_epoch)
1530 '(x, y, sample_weight) '
1531 'or (x, y). Found: ' +
-> 1532 str(generator_output))
1533 if len(generator_output) == 2:
1534 x, y = generator_output

ValueError: output of generator should be a tuple (x, y, sample_weight) or (x, y). Found: None
"

I would be very grateful if anyone could help me solve this issue.

Thanking you.

Kind regards.
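The root cause here appears to be the inner ImportError, not the ValueError: Keras's image loader needs PIL (provided by the Pillow package), and when it is missing the background generator thread dies, so fit_generator receives None and raises the ValueError above. A minimal availability check (`pip install Pillow` is the usual fix):

```python
# If the import fails, install Pillow: pip install Pillow
try:
    from PIL import Image
except ImportError:
    Image = None

status = "Pillow available" if Image is not None else "Pillow missing"
print(status)
```

Once Pillow imports cleanly, flow_from_directory should yield (x, y) tuples again and the ValueError should disappear.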

Very high training accuracy but a very low validation accuracy

Hi,

I am getting a very high training accuracy that improves per epoch, but a fairly low validation accuracy, on a different dataset that I am using with the supplied pretrained AlexNet code. I have 1598 training observations and 140 validation observations, split into two classes of equal size.

Below are the results from some epochs:

Epoch 1/1
1598/1598 [==============================] - 114s - loss: 0.0889 - acc: 0.8755 - val_loss: 0.2725 - val_acc: 0.5857
Epoch 1/1
1598/1598 [==============================] - 110s - loss: 0.0848 - acc: 0.8874 - val_loss: 0.2937 - val_acc: 0.5286
Epoch 1/1
1598/1598 [==============================] - 114s - loss: 0.0859 - acc: 0.8867 - val_loss: 0.2760 - val_acc: 0.6143
Epoch 1/1
1598/1598 [==============================] - 122s - loss: 0.0787 - acc: 0.8949 - val_loss: 0.2841 - val_acc: 0.5571
Epoch 1/1
1598/1598 [==============================] - 115s - loss: 0.0825 - acc: 0.8880 - val_loss: 0.2466 - val_acc: 0.5786
Epoch 1/1
1598/1598 [==============================] - 113s - loss: 0.0765 - acc: 0.9024 - val_loss: 0.2792 - val_acc: 0.6429
Epoch 1/1
1598/1598 [==============================] - 104s - loss: 0.0774 - acc: 0.9011 - val_loss: 0.2972 - val_acc: 0.5500

Any idea what could be the issue? Is the model overfitting the training set? I can see dropout is implemented, so why is it still overfitting? How can I improve the validation accuracy?
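With only 1598 training images, AlexNet-sized dense layers overfit easily; a ~0.90 training accuracy against ~0.53-0.64 validation accuracy is a classic overfitting gap, and dropout alone often isn't enough. Data augmentation usually helps. A minimal framework-independent numpy sketch of two common augmentations (random horizontal flip and random crop); the crop size 224 is an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img, crop=224):
    """Randomly flip and crop a channels-first (3, H, W) image."""
    if rng.random() < 0.5:
        img = img[:, :, ::-1]                 # horizontal flip
    _, h, w = img.shape
    top = rng.integers(0, h - crop + 1)       # random crop offsets
    left = rng.integers(0, w - crop + 1)
    return img[:, top:top + crop, left:left + crop]

img = np.zeros((3, 227, 227), dtype="float32")
out = augment(img)
print(out.shape)  # (3, 224, 224)
```

In the Keras version this repo targets, ImageDataGenerator(horizontal_flip=True, width_shift_range=..., height_shift_range=...) provides equivalent augmentation inside the training pipeline; freezing more of the pretrained layers is another lever worth trying.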

Training the model by GPU quickly eats up my memory

When I tried to train the Keras AlexNet experiment model on the GPU, it quickly ate up my system memory after 6 epochs. I tried reducing the sample count from 1024 to 256 per epoch; it lasted a bit longer, then exhausted my system memory again.

I have trained several AlexNet models using Caffe on Ubuntu, where system memory usage stabilizes at 20%-30% and never bursts, because Caffe uses GPU memory instead of system memory. So in Keras there must be some other cause of system memory growth during training, but I cannot figure out what it is. Please help!
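One hedged hypothesis: it is the host-side data arrays fed to fit_generator, not the GPU tensors, that fill system RAM. If the generator (or code around it) keeps references to every batch it has produced, for example by appending them to a list, nothing is ever freed. A generator should build each batch fresh and let it be garbage-collected; the sketch below uses synthetic arrays in place of real image loading:

```python
import numpy as np

def batch_generator(n_samples, batch_size=32):
    """Yield freshly built batches; keep no reference to previous ones."""
    while True:
        for start in range(0, n_samples, batch_size):
            size = min(batch_size, n_samples - start)
            x = np.zeros((size, 3, 227, 227), dtype="float32")  # load images here
            y = np.zeros((size, 2), dtype="float32")            # one-hot labels
            yield x, y  # consumed by fit_generator, then freed -- do NOT accumulate

gen = batch_generator(100)
x, y = next(gen)
print(x.shape, y.shape)  # (32, 3, 227, 227) (32, 2)
```

Also worth checking: in this Keras version fit_generator pre-fetches batches into a queue (the max_q_size argument); a large queue of large float32 batches can by itself consume gigabytes of host RAM.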

AlexNet weights are not present

Hi, I am trying to download the AlexNet weights from the given link; however, I am unable to do so. Could you please share the AlexNet weights?

ValueError: Dimensions must be equal, but are 27 and 26

Traceback (most recent call last):

File "", line 1, in
runfile('C:/Users/Lab PC/Documents/PythonWorkspace/convnets-keras-master/test.py', wdir='C:/Users/Lab PC/Documents/PythonWorkspace/convnets-keras-master')

File "C:\ProgramData\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 678, in runfile
execfile(filename, namespace)

File "C:\ProgramData\Anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 106, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)

File "C:/Users/Lab PC/Documents/PythonWorkspace/convnets-keras-master/test.py", line 14, in
model = convnet('alexnet',weights_path="weights/alexnet_weights.h5", heatmap=False)

File "C:\Users\Lab PC\Documents\PythonWorkspace\convnets-keras-master\convnetskeras\convnets.py", line 83, in convnet
convnet = convnet_init(weights_path, heatmap=False)

File "C:\Users\Lab PC\Documents\PythonWorkspace\convnets-keras-master\convnetskeras\convnets.py", line 248, in AlexNet
conv_2 = crosschannelnormalization(name='convpool_1')(conv_2)

File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 569, in call
self.add_inbound_node(inbound_layers, node_indices, tensor_indices)

File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 632, in add_inbound_node
Node.create_node(self, inbound_layers, node_indices, tensor_indices)

File "C:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py", line 164, in create_node
output_tensors = to_list(outbound_layer.call(input_tensors[0], mask=input_masks[0]))

File "C:\ProgramData\Anaconda3\lib\site-packages\keras\layers\core.py", line 596, in call
return self.function(x, **arguments)

File "C:\Users\Lab PC\Documents\PythonWorkspace\convnets-keras-master\convnetskeras\customlayers.py", line 24, in f
scale += alpha * extra_channels[:, i:i + ch, :, :]

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\ops\math_ops.py", line 979, in binary_op_wrapper
return func(x, y, name=name)

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\ops\gen_math_ops.py", line 296, in add
"Add", x=x, y=y, name=name)

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 787, in _apply_op_helper
op_def=op_def)

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 3392, in create_op
op_def=op_def)

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 1734, in init
control_input_ops)

File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\framework\ops.py", line 1570, in _create_c_op
raise ValueError(str(e))

ValueError: Dimensions must be equal, but are 27 and 26 for 'add_2' (op: 'Add') with input shapes: [?,27,27,100], [?,26,27,100].
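One plausible reading of this traceback: crosschannelnormalization is meant to pad and sum squares along the channel axis, but under the TensorFlow backend (with its different dim ordering and changed spatial_2d_padding semantics) the padding lands on a spatial axis instead, producing off-by-`half` spatial shapes like 27 vs 26. What the layer intends, written in pure numpy with the AlexNet-paper default parameters:

```python
import numpy as np

def local_response_norm(x, n=5, k=2.0, alpha=1e-4, beta=0.75):
    """Cross-channel LRN on a (batch, channels, H, W) array (AlexNet, eq. 1)."""
    half = n // 2
    squares = np.square(x)
    # Pad ONLY the channel axis, so each channel sees `half` neighbors per side.
    padded = np.pad(squares, ((0, 0), (half, half), (0, 0), (0, 0)))
    scale = k + alpha * sum(
        padded[:, i:i + x.shape[1], :, :] for i in range(n)
    )
    return x / scale ** beta

x = np.ones((1, 96, 27, 27), dtype="float32")
out = local_response_norm(x)
print(out.shape)  # (1, 96, 27, 27) -- spatial dims unchanged
```

Note the output spatial shape matches the input exactly; if a backend port pads a spatial axis instead of the channel axis, the shape mismatch in the Add op above is the result.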

ValueError: scale < 0

/home/c410/anaconda3/lib/python3.6/site-packages/keras/layers/core.py:577: UserWarning: `output_shape` argument not specified for layer mean_subtraction and cannot be automatically inferred with the Theano backend. Defaulting to output shape `(None, 3, 227, 227)` (same as input shape). If the expected output shape is different, specify it via the `output_shape` argument.
.format(self.name, input_shape))
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)

ValueError Traceback (most recent call last)
in ()
----> 1 alexnet = get_alexnet(input_size,nb_classes,mean_flag)
2
3 print(alexnet.summary())

~/Desktop/AlexNet/AlexNet-Experiments-Keras-master/Code/alexnet_base.py in get_alexnet(input_shape, nb_classes, mean_flag)
64
65 dense_1 = Flatten(name="flatten")(dense_1)
---> 66 dense_1 = Dense(4096, activation='relu',name='dense_1',init='he_normal')(dense_1)
67 dense_2 = Dropout(0.5)(dense_1)
68 dense_2 = Dense(4096, activation='relu',name='dense_2',init='he_normal')(dense_2)

~/anaconda3/lib/python3.6/site-packages/keras/engine/topology.py in call(self, x, mask)
541 'layer.build(batch_input_shape)')
542 if len(input_shapes) == 1:
--> 543 self.build(input_shapes[0])
544 else:
545 self.build(input_shapes)

~/anaconda3/lib/python3.6/site-packages/keras/layers/core.py in build(self, input_shape)
750 name='{}_W'.format(self.name),
751 regularizer=self.W_regularizer,
--> 752 constraint=self.W_constraint)
753 if self.bias:
754 self.b = self.add_weight((self.output_dim,),

~/anaconda3/lib/python3.6/site-packages/keras/engine/topology.py in add_weight(self, shape, initializer, name, trainable, regularizer, constraint)
413 '''
414 initializer = initializations.get(initializer)
--> 415 weight = initializer(shape, name=name)
416 if regularizer is not None:
417 self.add_loss(regularizer(weight))

~/anaconda3/lib/python3.6/site-packages/keras/initializations.py in he_normal(shape, name, dim_ordering)
66 fan_in, fan_out = get_fans(shape, dim_ordering=dim_ordering)
67 s = np.sqrt(2. / fan_in)
---> 68 return normal(shape, s, name=name)
69
70

~/anaconda3/lib/python3.6/site-packages/keras/initializations.py in normal(shape, scale, name)
35
36 def normal(shape, scale=0.05, name=None):
---> 37 return K.random_normal_variable(shape, 0.0, scale, name=name)
38
39

~/anaconda3/lib/python3.6/site-packages/keras/backend/theano_backend.py in random_normal_variable(shape, mean, scale, dtype, name)
181
182 def random_normal_variable(shape, mean, scale, dtype=None, name=None):
--> 183 return variable(np.random.normal(loc=0.0, scale=scale, size=shape),
184 dtype=dtype, name=name)
185

mtrand.pyx in mtrand.RandomState.normal()

ValueError: scale < 0
My Keras is already release 1.2 and the JSON config is set, but running the code still shows the error above. Why?

I am stuck by this problem in customlayers.py, please help

40 alexnet = get_alexnet(input_size,nb_classes,mean_flag)
41
42 print (alexnet.summary())

E:\TEMP\R\KERAS\AlexNet-Experiments-Keras-master\AlexNet-Experiments-Keras-master\Code\alexnet_base.py in get_alexnet(input_shape, nb_classes, mean_flag)
36
37 conv_2 = MaxPooling2D((3, 3), strides=(2,2))(conv_1)
---> 38 conv_2 = crosschannelnormalization(name="convpool_1")(conv_2)
39 conv_2 = ZeroPadding2D((2,2))(conv_2)
40 conv_2 = merge([

d:\ProgramData\Anaconda3\lib\site-packages\keras\engine\topology.py in call(self, inputs, **kwargs)
583
584 # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 585 output = self.call(inputs, **kwargs)
586 output_mask = self.compute_mask(inputs, previous_mask)
587

d:\ProgramData\Anaconda3\lib\site-packages\keras\layers\core.py in call(self, inputs, mask)
657 if 'mask' in arg_spec.args:
658 arguments['mask'] = mask
--> 659 return self.function(inputs, **arguments)
660
661 def compute_mask(self, inputs, mask=None):

E:\TEMP\R\KERAS\AlexNet-Experiments-Keras-master\AlexNet-Experiments-Keras-master\convnets-keras\convnetskeras\customlayers.py in f(X)
16 square = K.square(X)
17 extra_channels = K.spatial_2d_padding(K.permute_dimensions(square,(0,2,3,1))
---> 18 , (0,half))
19 extra_channels = K.permute_dimensions(extra_channels, (0,3,1,2))
20 scale = k

d:\ProgramData\Anaconda3\lib\site-packages\keras\backend\theano_backend.py in spatial_2d_padding(x, padding, data_format)
995 """
996 assert len(padding) == 2
--> 997 assert len(padding[0]) == 2
998 assert len(padding[1]) == 2
999 top_pad, bottom_pad = padding[0]

TypeError: object of type 'int' has no len()
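This traceback (and the similar ones in the issues below) comes from a Keras API change: in Keras 1, K.spatial_2d_padding(x, (0, half)) meant "pad 0 rows and `half` columns on each side"; Keras 2 expects explicit per-side pairs, i.e. ((0, 0), (half, half)), hence the `assert len(padding[0]) == 2` failure on the plain int 0. The numpy equivalent of the intended operation on a channels-last (batch, rows, cols, channels) tensor:

```python
import numpy as np

x = np.ones((1, 4, 4, 3))
half = 2

# Keras 1 style (0, half): rows padded by 0, cols padded by `half` each side.
# Keras 2 equivalent call: K.spatial_2d_padding(x, ((0, 0), (half, half)))
padded = np.pad(x, ((0, 0), (0, 0), (half, half), (0, 0)))
print(padded.shape)  # (1, 4, 8, 3)
```

So one fix, if you must run under Keras 2, is to change the call in customlayers.py to the pair-of-pairs form; the safer option is pinning the Keras 1.x version the repo was written against.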

Error when calling get_alexnet function

Hello,

When I run:

alexnet = get_alexnet(input_size,nb_classes,mean_flag)

print (alexnet.summary())

# I am having the following error.

...\AlexNet-Experiments-Keras-master\convnets-keras\convnetskeras\customlayers.py in f(X)
17 square = K.square(X)
18 extra_channels = K.spatial_2d_padding(K.permute_dimensions(square, (0,2,3,1)), (0, half))
---> 19 extra_channels = K.permute_dimensions(extra_channels, (0,3,1,2))
20 scale = k
21 for i in range(n):

C:\ProgramData\Anaconda3\lib\site-packages\keras\backend\theano_backend.py in spatial_2d_padding(x, padding, data_format)
1061 """
1062 assert len(padding) == 2
-> 1063 assert len(padding[0]) == 2
1064 assert len(padding[1]) == 2
1065 top_pad, bottom_pad = padding[0]

Can anybody help me?

Thanks.

Regards.

Why split tensor for layers 2, 4 and 5?

Hello Rahul,

I am new to deep machine learning.

I came across your codes AlexNet-Experiments-Keras on Github. Thank you for the well documented guidelines. It really helped me.

However, I cannot understand why you split the tensor first before performing the convolution for layers 2, 4 and 5? (https://github.com/duggalrahul/AlexNet-Experiments-Keras/blob/master/convnets-keras/convnetskeras/customlayers.py)

Also, why is this process not done on layer 3? (https://github.com/duggalrahul/AlexNet-Experiments-Keras/blob/master/Code/alexnet_base.py)

I would be very grateful if you could answer my question.

Looking forward to your response.

Thanking you in advance.

Kind regards.
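For context, the split comes from the original AlexNet paper: the network was trained across two GPUs, and layers 2, 4 and 5 connect only to the feature maps that live on the same GPU (a grouped convolution), while layer 3 connects across both halves, which is why no split appears there. A minimal sketch of the channel split these layers perform (numpy stands in for the repo's splittensor helper):

```python
import numpy as np

x = np.ones((1, 96, 27, 27))           # layer-1 output: 96 feature maps
upper, lower = np.split(x, 2, axis=1)  # one half per (historical) GPU
print(upper.shape, lower.shape)        # (1, 48, 27, 27) (1, 48, 27, 27)
```

Each half is then convolved independently and the results are concatenated back along the channel axis, reproducing the connectivity pattern of the two-GPU training setup.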

Error of padding

Thanks for providing such great code!
When I run alexnet = get_alexnet(input_size,nb_classes,mean_flag), the following error occurs:

TypeError                                 Traceback (most recent call last)
<ipython-input-4-aeb62c90d3b4> in <module>()
      1 print(input_size)
----> 2 alexnet = get_alexnet(input_size,nb_classes,mean_flag)
      3 
      4 #print alexnet.summary()

/media/chutz/000FC3F700054C75/AlexNet/AlexNet-Experiments-Keras/Code/alexnet_base.py in get_alexnet(input_shape, nb_classes, mean_flag)
     21 
     22         conv_2 = MaxPooling2D((3, 3), strides=(2,2))(conv_1)
---> 23         conv_2 = crosschannelnormalization(name="convpool_1")(conv_2)
     24         conv_2 = ZeroPadding2D((2,2))(conv_2)
     25 	conv_2 = merge([

~/anaconda3/lib/python3.6/site-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
    617 
    618             # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 619             output = self.call(inputs, **kwargs)
    620             output_mask = self.compute_mask(inputs, previous_mask)
    621 

~/anaconda3/lib/python3.6/site-packages/keras/layers/core.py in call(self, inputs, mask)
    661         if has_arg(self.function, 'mask'):
    662             arguments['mask'] = mask
--> 663         return self.function(inputs, **arguments)
    664 
    665     def compute_mask(self, inputs, mask=None):

/media/chutz/000FC3F700054C75/AlexNet/AlexNet-Experiments-Keras/convnets-keras/convnetskeras/customlayers.py in f(X)
     15         half = n // 2
     16         square = K.square(X)
---> 17         extra_channels = K.spatial_2d_padding(K.permute_dimensions(square, (0,2,3,1)) , (0,half))
     18         extra_channels = K.permute_dimensions(extra_channels, (0,3,1,2))
     19         scale = k

~/anaconda3/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py in spatial_2d_padding(x, padding, data_format)
   2182     """
   2183     assert len(padding) == 2
-> 2184     assert len(padding[0]) == 2
   2185     assert len(padding[1]) == 2
   2186     if data_format is None:

TypeError: object of type 'int' has no len()

and when I debugged in a Jupyter notebook, it shows

ipdb> padding
(0, 2)

Any help is appreciated!!

get_alexnet scale error

Hello,
When running the Jupyter notebook, I cannot start AlexNet at all; I always get the set of errors attached:

(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)
(Subtensor{int64}.0, Elemwise{add,no_inplace}.0, Elemwise{add,no_inplace}.0, Subtensor{int64}.0)

ValueError Traceback (most recent call last)
in ()
----> 1 alexnet = get_alexnet(input_size,nb_classes,mean_flag)
2
3 #print alexnet.summary()

/home/carlos/alexnet/AlexNet-Experiments-Keras/Code/alexnet_base.pyc in get_alexnet(input_shape, nb_classes, mean_flag)
63
64 dense_1 = Flatten(name="flatten")(dense_1)
---> 65 dense_1 = Dense(4096, activation='relu',name='dense_1',init='he_normal')(dense_1)
66 dense_2 = Dropout(0.5)(dense_1)
67 dense_2 = Dense(4096, activation='relu',name='dense_2',init='he_normal')(dense_2)

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/engine/topology.pyc in call(self, x, mask)
541 'layer.build(batch_input_shape)')
542 if len(input_shapes) == 1:
--> 543 self.build(input_shapes[0])
544 else:
545 self.build(input_shapes)

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/layers/core.pyc in build(self, input_shape)
750 name='{}_W'.format(self.name),
751 regularizer=self.W_regularizer,
--> 752 constraint=self.W_constraint)
753 if self.bias:
754 self.b = self.add_weight((self.output_dim,),

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/engine/topology.pyc in add_weight(self, shape, initializer, name, trainable, regularizer, constraint)
413 '''
414 initializer = initializations.get(initializer)
--> 415 weight = initializer(shape, name=name)
416 if regularizer is not None:
417 self.add_loss(regularizer(weight))

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/initializations.pyc in he_normal(shape, name, dim_ordering)
66 fan_in, fan_out = get_fans(shape, dim_ordering=dim_ordering)
67 s = np.sqrt(2. / fan_in)
---> 68 return normal(shape, s, name=name)
69
70

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/initializations.pyc in normal(shape, scale, name)
35
36 def normal(shape, scale=0.05, name=None):
---> 37 return K.random_normal_variable(shape, 0.0, scale, name=name)
38
39

/home/carlos/.conda/envs/ipykernel_py2/lib/python2.7/site-packages/keras/backend/theano_backend.pyc in random_normal_variable(shape, mean, scale, dtype, name)
181
182 def random_normal_variable(shape, mean, scale, dtype=None, name=None):
--> 183 return variable(np.random.normal(loc=0.0, scale=scale, size=shape),
184 dtype=dtype, name=name)
18

mtrand.pyx in mtrand.RandomState.normal()

ValueError: scale < 0

This seems ultimately related to the function random_normal_variable(). My question is how the scale parameter is calculated, as it is always taking negative values. Is this a known bug?

Thanks in advance.

Carlos
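For reference, he_normal draws weights from a normal distribution with scale s = sqrt(2 / fan_in), so s only goes wrong when fan_in does; in these reports the Lambda mean-subtraction layer's output shape is mis-inferred, so the fan-in reaching the Dense layer after Flatten comes out wrong. A numpy sketch of the computation, which is well-behaved whenever fan_in is a positive integer (the 9216 x 4096 shape mirrors AlexNet's flatten-to-dense_1 transition):

```python
import numpy as np

def he_normal(shape, rng=np.random.default_rng(0)):
    """He et al. (2015) init: N(0, sqrt(2 / fan_in)) for a (fan_in, fan_out) shape."""
    fan_in = shape[0]
    assert fan_in > 0, "non-positive fan_in means the input shape was mis-inferred"
    s = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, s, size=shape)

w = he_normal((9216, 4096))
print(w.shape)  # (9216, 4096)
```

So rather than a bug in the initializer itself, the fix is to make the upstream Lambda layer report its output shape explicitly (via the output_shape argument the UserWarning mentions), so that get_fans sees the true flattened dimension.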
