
cytocommunity's People

Contributors

hubioinfo, tanlabcode, xuyafei993, yafeixu-xidian, zhaohh52


cytocommunity's Issues

[CytoCommunity] Using visium data

Dear authors,

Hi all! Thank you so much for sharing this great tool with the community.
I happened to notice that you used the DLPFC 10x Visium data with CytoCommunity as well, but I can't find a toy example of how you converted the raw Space Ranger output into the current CytoCommunity input files. Could you provide a tutorial or an explanation of how to do this?

Many thanks in advance,
J
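(For readers with the same question: below is a rough sketch of the kind of conversion involved, assuming CytoCommunity's per-image input convention of a tab-separated coordinates file and a per-spot type-label file. All file names, column names, and the exact layout here are hypothetical, not the authors' confirmed format.)

```python
import pandas as pd

# Hypothetical table derived from Space Ranger output: one row per spot,
# with spatial coordinates and a cluster/layer label per spot.
spots = pd.DataFrame({
    "x": [10.0, 20.0, 30.0],
    "y": [5.0, 15.0, 25.0],
    "label": ["Layer1", "Layer2", "Layer2"],
})

# Write per-image files in a tab-separated, header-less layout,
# mirroring the style of CytoCommunity's tutorial inputs (format assumed).
spots[["x", "y"]].to_csv("DLPFC_Coordinates.txt", sep="\t", header=False, index=False)
spots[["label"]].to_csv("DLPFC_CellTypeLabel.txt", header=False, index=False)
```

The two files must stay row-aligned: line i of the coordinates file and line i of the label file describe the same spot.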

supervised method

Hi

I would like to generate the heatmap shown in your paper. Could you please share the code or describe your approach?

Also, I cannot find where to get the file '..._GraphLabel.txt'.

Thank you!
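(A note for others with the same question: the paper's heatmaps summarize cell-type composition per TCN, which can be approximated from the per-cell TCN assignments. A minimal sketch, with entirely hypothetical data and column names:)

```python
import pandas as pd

# Hypothetical per-cell results: a TCN assignment and a cell type for each cell.
df = pd.DataFrame({
    "TCN": [0, 0, 1, 1, 1, 2],
    "CellType": ["B", "T", "T", "Macro", "T", "B"],
})

# Fraction of each cell type within each TCN (each row sums to 1).
enrich = pd.crosstab(df["TCN"], df["CellType"], normalize="index")
print(enrich)

# Plotting this matrix, e.g. with seaborn:
#   import seaborn as sns; sns.heatmap(enrich, annot=True)
# yields a composition heatmap in the spirit of the paper's figure.
```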

How to get the _NodeAttr.txt and _GraphIndex.txt files

Hello,

Thank you for providing this interesting tool.

I have surely missed something, but I could not work out from the tutorial how to create the _NodeAttr.txt and _GraphIndex.txt files.

Could you give me some hints on how to generate those files?

Thank you,
Pacôme
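(For anyone else stuck here: as I understand the pipeline, these files are intermediate outputs of the Step 1 graph-construction script rather than files you write by hand, and the node-attribute matrix is essentially a one-hot encoding of the per-cell type labels. A rough sketch of that encoding, with hypothetical labels and file name:)

```python
import numpy as np

# Hypothetical per-cell type labels for one image.
cell_types = ["B", "T", "T", "Macrophage"]
names = sorted(set(cell_types))  # ["B", "Macrophage", "T"]

# One-hot node-attribute matrix: one row per cell, one column per type.
node_attr = np.zeros((len(cell_types), len(names)), dtype=int)
for row, ct in enumerate(cell_types):
    node_attr[row, names.index(ct)] = 1

np.savetxt("Image1_NodeAttr.txt", node_attr, fmt="%d", delimiter="\t")
```

Each row sums to 1, since every cell carries exactly one type label.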

Run CytoCommunity on MacOS

Hello, Yuxuan

Thank you for developing the tool!

I can run CytoCommunity very smoothly on Linux but have a hard time running it on my Mac. Could you please also provide an installation file for macOS (a macOS yml file) so that Mac users like me can install it? I tried installing it on my Mac but ran into a segmentation fault when running the Python scripts; I suspect the package versions differ.

Thank you!

Best,
Xiyu

Conda Package Installation Failure on Windows Using requirements.txt

Hello Authors,

Thanks for creating this tool.

I'm having trouble installing the Python environment from requirements.txt using Conda. It can't find some packages:

Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - libuv==1.44.2=h8ffe710_0
  - cffi==1.15.1=py310hcbf9ad4_0
  - yaml==0.2.5=h8ffe710_2
  - typing_extensions==4.4.0=pyha770c72_0
  - numpy==1.23.3=py310h4a8f9c9_0
  - six==1.16.0=pyh6c4a22f_0
  - setuptools==65.4.1=pyhd8ed1ab_0
  - vc==14.2=hac3ee0b_8
  - vs2015_runtime==14.29.30139=h890b9b1_8
  - contourpy==1.0.5=py310h232114e_0
  - xorg-libxau==1.0.9=hcd874cb_0
  - libbrotlidec==1.0.9=h8ffe710_7
  - libtiff==4.4.0=h8e97e67_4
  - tqdm==4.64.1=pyhd8ed1ab_0
  - lerc==4.0.0=h63175ca_0
  - jpeg==9e=h8ffe710_2
  - requests==2.28.1=pyhd8ed1ab_1
  - pytorch-mutex==1.0=cpu
  - brotli==1.0.9=h8ffe710_7
  - pip==22.2.2=pyhd8ed1ab_0
  - pcre2==10.37=hdfff0fc_1
  - xorg-libxdmcp==1.1.3=hcd874cb_0
  - intel-openmp==2022.1.0=h57928b3_3787
  - libwebp-base==1.2.4=h8ffe710_0
  - win_inet_pton==1.1.0=py310h5588dad_4
  - matplotlib-base==3.6.0=py310h51140c5_0
  - xz==5.2.6=h8d14728_0
  - yacs==0.1.8=pyhd8ed1ab_0
  - colorama==0.4.5=pyhd8ed1ab_0
  - liblapack==3.9.0=16_win64_mkl
  - urllib3==1.26.11=pyhd8ed1ab_0
  - pytorch-spline-conv==1.2.1=py310_torch_1.11.0_cpu
  - libiconv==1.17=h8ffe710_0
  - libvorbis==1.3.7=h0e60522_0
  - ply==3.11=py_1
  - pandas==1.5.0=py310h1c4a608_0
  - pyopenssl==22.0.0=pyhd8ed1ab_1
  - scikit-learn==1.1.2=py310h3a564e9_0
  - python==3.10.6=h9a09f29_0_cpython
  - libdeflate==1.14=hcfcfb64_0
  - zstd==1.5.2=h7755175_4
  - tzdata==2022d=h191b570_0
  - python-louvain==0.15=pyhd8ed1ab_1
  - mkl-devel==2022.1.0=h57928b3_875
  - glib==2.74.0=h12be248_0
  - blas==2.116=mkl
  - matplotlib==3.6.0=py310h5588dad_0
  - toml==0.10.2=pyhd8ed1ab_0
  - sip==6.6.2=py310h8a704f9_0
  - pyg==2.0.4=py310_torch_1.11.0_cpu
  - ucrt==10.0.20348.0=h57928b3_0
  - mkl==2022.1.0=h6a75c08_874
  - mkl-include==2022.1.0=h6a75c08_874
  - statsmodels==0.13.2=py310h2873277_0
  - unicodedata2==14.0.0=py310he2412df_1
  - m2w64-pcre2==10.34=0
  - tk==8.6.12=h8ffe710_0
  - libogg==1.3.4=h8ffe710_1
  - qt-main==5.15.6=hf0cf448_0
  - networkx==2.8.7=pyhd8ed1ab_0
  - libglib==2.74.0=h79619a9_0
  - python_abi==3.10=2_cp310
  - jinja2==3.1.2=pyhd8ed1ab_1
  - freetype==2.12.1=h546665d_0
  - wheel==0.37.1=pyhd8ed1ab_0
  - icu==70.1=h0e60522_0
  - pytorch-scatter==2.0.9=py310_torch_1.11.0_cpu
  - pillow==9.2.0=py310h52929f7_2
  - libzlib==1.2.12=hcfcfb64_4
  - pytorch==1.11.0=py3.10_cpu_0
  - zlib==1.2.12=hcfcfb64_4
  - libpng==1.6.38=h19919ed_0
  - markupsafe==2.1.1=py310he2412df_1
  - brotlipy==0.7.0=py310he2412df_1004
  - pyqt==5.15.7=py310hbabf5d4_0
  - pyqt5-sip==12.11.0=py310h8a704f9_0
  - gstreamer==1.20.3=h6b5321d_2
  - pytorch-sparse==0.6.15=py310_torch_1.11.0_cpu
  - tornado==6.2=py310he2412df_0
  - pytorch-cluster==1.6.0=py310_torch_1.11.0_cpu
  - cpuonly==2.0=0
  - bzip2==1.0.8=h8ffe710_4
  - patsy==0.5.3=pyhd8ed1ab_0
  - kiwisolver==1.4.4=py310h476a331_0
  - blas-devel==3.9.0=16_win64_mkl
  - pysocks==1.7.1=pyh0701188_6
  - rpy2==3.5.1=py310r41h3c8b411_1
  - libbrotlicommon==1.0.9=h8ffe710_7
  - tbb==2021.6.0=h91493d7_0
  - pytz==2022.4=pyhd8ed1ab_0
  - pyparsing==3.0.9=pyhd8ed1ab_0
  - r-base==4.1.3=hddad469_1
  - cycler==0.11.0=pyhd8ed1ab_0
  - python-dateutil==2.8.2=pyhd8ed1ab_0
  - charset-normalizer==2.1.1=pyhd8ed1ab_0
  - munkres==1.1.4=pyh9f0ad1d_0
  - libbrotlienc==1.0.9=h8ffe710_7
  - libxcb==1.13=hcd874cb_1004
  - brotli-bin==1.0.9=h8ffe710_7
  - fonttools==4.37.4=py310h8d17308_0
  - liblapacke==3.9.0=16_win64_mkl
  - scipy==1.9.1=py310h578b7cb_0
  - packaging==21.3=pyhd8ed1ab_0
  - glib-tools==2.74.0=h12be248_0
  - pthread-stubs==0.4=hcd874cb_1001
  - gst-plugins-base==1.20.3=h001b923_2
  - lcms2==2.12=h2a16943_0
  - krb5==1.19.3=h1176d77_0
  - openjpeg==2.5.0=hc9384bd_1
  - libcblas==3.9.0=16_win64_mkl
  - joblib==1.2.0=pyhd8ed1ab_0
  - libsqlite==3.39.4=hcfcfb64_0
  - threadpoolctl==3.1.0=pyh8a188c0_0
  - libblas==3.9.0=16_win64_mkl
  - pycparser==2.21=pyhd8ed1ab_0
  - pyyaml==6.0=py310he2412df_4
  - seaborn-base==0.12.0=pyhd8ed1ab_0
  - seaborn==0.12.0=hd8ed1ab_0
  - libffi==3.4.2=h8ffe710_5
  - gettext==0.19.8.1=h5728263_1009
  - idna==3.4=pyhd8ed1ab_0

Current channels:

  - https://repo.anaconda.com/pkgs/main/win-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/win-64
  - https://repo.anaconda.com/pkgs/r/noarch
  - https://repo.anaconda.com/pkgs/msys2/win-64
  - https://repo.anaconda.com/pkgs/msys2/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.

Do you have any advice for fixing this? Should I use specific channels to get these packages?

Thanks!
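(A suggestion for anyone hitting this: the build strings in the list, e.g. `pyhd8ed1ab_0` and `py310_torch_1.11.0_cpu`, look like conda-forge / pytorch / pyg builds, so the default Anaconda channels shown in the error will never find them. Declaring those channels, for instance at the top of an environment file, may resolve it. The fragment below is a hedged sketch, not the project's official file; the exact dependency pins are omitted.)

```yaml
# Hypothetical channel block for an environment.yml; package pins omitted.
name: CytoCommunity
channels:
  - conda-forge
  - pytorch
  - pyg
dependencies:
  - python=3.10
```

Equivalently, `-c conda-forge -c pytorch -c pyg` can be passed on the `conda create`/`conda install` command line.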

Dataset for reproducing results

Hi there,

really cool package and congratulations on the publication.

Some people on my team are really interested in trying to reproduce some of your results, especially on the CODEX spleen dataset. Looking through the "Data Availability Statement" in the publication, I noticed that you refer to the original data repository; however, I could not find the compartment annotations there (e.g., "Red Pulp", "Marginal Zone", etc.).

Would it be possible for you to share the data with the cell type and compartment annotations, perhaps as an .h5ad object or something similar? I assume you used something like this to run your method and evaluate your results.

Best,
Alma

Error "dimension specified as 0 but tensor has no dimensions" in Step 2 - SoftTCNLearning_Supervised.py

Hello,

Now that I have the input in the right place, I managed to run Steps 0 and 1 successfully.

However, in Step 2 (SoftTCNLearning_Supervised.py), when running this line:
model = Net(dataset.num_features, dataset.num_classes).to(device)

I get :

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Cell In[57], line 2
      1 device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
----> 2 model = Net(dataset.num_features, dataset.num_classes).to(device)  #Initialize model for each fold.
      3 optimizer = torch.optim.Adam(model.parameters(), lr=LearningRate)
      5 FoldFolderName = TimeFolderName + "/Fold" + str(num_fold)

File ~/anaconda3/envs/CytoCommunity/lib/python3.10/site-packages/torch_geometric/data/dataset.py:114, in Dataset.num_features(self)
    110 @property
    111 def num_features(self) -> int:
    112     r"""Returns the number of features per node in the dataset.
    113     Alias for :py:attr:`~num_node_features`."""
--> 114     return self.num_node_features

File ~/anaconda3/envs/CytoCommunity/lib/python3.10/site-packages/torch_geometric/data/dataset.py:103, in Dataset.num_node_features(self)
    100 @property
    101 def num_node_features(self) -> int:
    102     r"""Returns the number of features per node in the dataset."""
--> 103     data = self[0]
    104     data = data[0] if isinstance(data, tuple) else data
    105     if hasattr(data, 'num_node_features'):

File ~/anaconda3/envs/CytoCommunity/lib/python3.10/site-packages/torch_geometric/data/dataset.py:198, in Dataset.__getitem__(self, idx)
    193 if (isinstance(idx, (int, np.integer))
    194         or (isinstance(idx, Tensor) and idx.dim() == 0)
    195         or (isinstance(idx, np.ndarray) and np.isscalar(idx))):
    197     data = self.get(self.indices()[idx])
--> 198     data = data if self.transform is None else self.transform(data)
    199     return data
    201 else:

File ~/anaconda3/envs/CytoCommunity/lib/python3.10/site-packages/torch_geometric/transforms/to_dense.py:51, in ToDense.__call__(self, data)
     48     size = [num_nodes - data.pos.size(0)] + list(data.pos.size())[1:]
     49     data.pos = torch.cat([data.pos, data.pos.new_zeros(size)], dim=0)
---> 51 if data.y is not None and (data.y.size(0) == orig_num_nodes):
     52     size = [num_nodes - data.y.size(0)] + list(data.y.size())[1:]
     53     data.y = torch.cat([data.y, data.y.new_zeros(size)], dim=0)

IndexError: dimension specified as 0 but tensor has no dimensions

Accessing any of the attributes dataset.num_classes, dataset.num_features, dataset.num_node_features, or dataset.num_edge_features produces the same error; it appears to be related to accessing data.y.size(0).

Best,
Pacôme
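(A likely culprit, for anyone else seeing this: the ToDense transform in the traceback checks `data.y.size(0)`, and `size(0)` raises exactly this IndexError on a 0-dimensional tensor. So the graph label may have been stored as a scalar tensor rather than a length-1 tensor. A small sketch of the difference, assuming that is the cause here:)

```python
import torch

scalar_y = torch.tensor(3)    # 0-dim tensor: calling size(0) raises IndexError
vector_y = torch.tensor([3])  # 1-dim tensor of length 1: size(0) == 1

assert scalar_y.dim() == 0
assert vector_y.size(0) == 1

# Storing the graph label as a length-1 tensor (or reshaping an existing
# scalar with scalar_y.view(1)) avoids the failing size(0) check.
assert scalar_y.view(1).size(0) == 1
```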
