
metaseq's Introduction

Metaseq

[Badges: Travis CI build status (master) | install with bioconda]

Briefly, the goal of metaseq is to tie together lots of existing software into a framework for exploring genomic data. It focuses on flexibility and interactive exploration and plotting of disparate genomic data sets.

The main documentation for metaseq can be found at https://daler.github.io/metaseq.

If you use metaseq in your work, please cite the following publication:

Dale, R. K., Matzat, L. H. & Lei, E. P. metaseq: a Python package for integrative genome-wide analysis reveals relationships between chromatin insulators and associated nuclear mRNA. Nucleic Acids Res. 42, 9158–9170 (2014). http://www.ncbi.nlm.nih.gov/pubmed/25063299

Example 1: Average ChIP-seq signal over promoters

Example 1 walks you through the creation of the following heatmap and line-plot figure:

demo.png

Top: Heatmap of ATF3 ChIP-seq signal over transcription start sites (TSS) on chr17 in human K562 cells. Middle: average ChIP enrichment over all TSSs +/- 1kb, with 95% CI band. Bottom: Integration with ATF3 knockdown RNA-seq results, showing differential enrichment over transcripts that went up, down, or were unchanged upon ATF3 knockdown.
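
The example session builds this figure with metaseq's plotting helpers; purely as an illustrative sketch of the underlying idea (not the example's actual code, and using placeholder data), the heatmap and average-profile panels boil down to sorting a TSS-by-bins array and plotting its column-wise mean:

    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder array standing in for normalized ChIP-seq signal:
    # one row per TSS, 100 bins spanning -1 kb .. +1 kb around each TSS.
    rng = np.random.RandomState(0)
    arr = rng.gamma(shape=2.0, scale=1.0, size=(2000, 100))

    # Sort rows by mean signal so similar TSSs are grouped in the heatmap.
    order = np.argsort(arr.mean(axis=1))[::-1]
    x = np.linspace(-1000, 1000, arr.shape[1])

    fig, (ax_heat, ax_avg) = plt.subplots(2, 1, sharex=True, figsize=(4, 8))
    ax_heat.imshow(arr[order], aspect='auto', extent=(-1000, 1000, 0, len(arr)))
    ax_heat.set_ylabel('TSSs (sorted by mean signal)')

    # Average profile over all TSSs (the middle panel of the figure).
    ax_avg.plot(x, arr.mean(axis=0), color='k')
    ax_avg.set_xlabel('Distance from TSS (bp)')
    ax_avg.set_ylabel('Average enrichment')
    plt.show()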

Example 2: Differential expression scatterplots

Example 2 walks you through the creation of the following scatterplot and marginal histogram figure:

expression-demo.png

Control vs knockdown expression (log2(FPKM + 1)) for an ATF3 knockdown experiment. Each point represents one transcript on chromosome 17. Marginal distributions are shown on top and side. 1:1 line shown as a dotted line. Up- and downregulated genes determined by a simple 2-fold cutoff.
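
Again, the example session does this with metaseq's helpers; as an illustrative sketch only (placeholder data, assumed variable names), the scatter and the 2-fold classification look roughly like this:

    import numpy as np
    import matplotlib.pyplot as plt

    # Placeholder FPKM vectors standing in for control and ATF3-knockdown
    # expression of the chr17 transcripts.
    rng = np.random.RandomState(0)
    control_fpkm = rng.lognormal(mean=1.0, sigma=1.5, size=3000)
    knockdown_fpkm = control_fpkm * rng.lognormal(mean=0.0, sigma=0.5, size=3000)

    x = np.log2(control_fpkm + 1)
    y = np.log2(knockdown_fpkm + 1)

    # Simple 2-fold cutoff: a difference of 1 on the log2 scale.
    up = (y - x) > 1
    down = (x - y) > 1
    unchanged = ~(up | down)

    fig, ax = plt.subplots()
    ax.scatter(x[unchanged], y[unchanged], s=5, color='0.5', label='unchanged')
    ax.scatter(x[up], y[up], s=5, color='r', label='up')
    ax.scatter(x[down], y[down], s=5, color='b', label='down')
    lim = max(x.max(), y.max())
    ax.plot([0, lim], [0, lim], linestyle=':', color='k')  # 1:1 line
    ax.set_xlabel('control, log2(FPKM + 1)')
    ax.set_ylabel('ATF3 knockdown, log2(FPKM + 1)')
    ax.legend()
    plt.show()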

Other features

In addition, metaseq offers:

  • A format-agnostic API for accessing "genomic signal", which lets you work with BAM, BED, VCF, GTF, GFF, bigBed, and bigWig files through the same interface (see the quick-start sketch after this list)
  • Parallel data access from the file formats mentioned above
  • "Mini-browsers", zoomable and pannable Python-only figures that show genomic signal and gene models and are spawned by clicking on features of interest
  • A wrapper around pandas.DataFrames to simplify the manipulation and plotting of tabular results data that contain gene information (like DESeq results tables)
  • Integration of data keyed by genomic interval (e.g., BAM or BED files) with data keyed by gene ID (e.g., Cufflinks or DESeq results tables)
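
As a quick taste of the format-agnostic API in the first bullet above, the following minimal sketch (placeholder file names; the calls mirror the example session) creates a genomic signal object, extracts a features-by-bins array in parallel, and normalizes by library size:

    import multiprocessing
    import pybedtools
    import metaseq

    # Placeholder inputs; any supported format (BAM, bigWig, BED, ...) can be
    # passed with the matching `kind` string.
    ip_signal = metaseq.genomic_signal('ip.bam', 'bam')
    features = pybedtools.BedTool('tsses_1kb.bed')

    # One row per feature, `bins` columns per row, computed in parallel.
    arr = ip_signal.array(features, bins=100,
                          processes=multiprocessing.cpu_count())

    # Normalize to reads per million mapped reads.
    arr /= ip_signal.mapped_read_count() / 1e6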

Check out the full documentation for more.

metaseq's People

Contributors

daler, olgabot


metaseq's Issues

Is metaseq supported by Python 3?

Hi daler, I love your metaseq and pybedtools. Have you considered upgrading metaseq so that we could do our data analysis work on Python 3?

IndexError and ValueError

Hi again. I was hoping you might help me guess the source of some errors I have been getting. I am trying to make average plots of bigWig histone mark data (25 files), plotted around my TFBS (a GFF file). I made a Python script with all the files included, which I can prune down later. Trying to execute the file gave me this error:
(metaseq-test)hart@hart-ubuntu:~/BigData/gff$ python bf.py
Traceback (most recent call last):
  File "bf.py", line 181, in <module>
    arrays['H2Av'].mean(axis=0),
  File "/home/hart/miniconda/envs/metaseq-test/lib/python2.7/site-packages/numpy/core/_methods.py", line 56, in _mean
    rcount = _count_reduce_items(arr, axis)
  File "/home/hart/miniconda/envs/metaseq-test/lib/python2.7/site-packages/numpy/core/_methods.py", line 50, in _count_reduce_items
    items *= arr.shape[ax]
IndexError: tuple index out of range

After failing to understand the cause of the error, I just commented out the offending section of code, which allowed the program to finish. However, the average plot it generated had no signal on it, so I replaced TFBS.gff3 with an older one that I had previously used to make these plots.
Now when I run the script I get a new error message:
(metaseq-test)hart@hart-ubuntu:~/BigData/gff$ python bf.py
Traceback (most recent call last):
  File "bf.py", line 54, in <module>
    processes=4)
  File "/home/hart/miniconda/envs/metaseq-test/lib/python2.7/site-packages/metaseq/_genomic_signal.py", line 126, in array
    stacked_arrays = np.row_stack(arrays)
  File "/home/hart/miniconda/envs/metaseq-test/lib/python2.7/site-packages/numpy/core/shape_base.py", line 230, in vstack
    return _nx.concatenate([atleast_2d(_m) for _m in tup], 0)
ValueError: need at least one array to concatenate

So any guesses as to what is causing the errors? Thank you!
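
Not a maintainer's answer, but a hedged diagnostic sketch: both tracebacks are consistent with an empty set of features reaching metaseq (averaging a zero-length result and np.row_stack with nothing to stack would fail in exactly these ways), so a quick first check is to count the intervals that actually come out of the GFF and look at the shape of the returned array. File names below are placeholders, and the 'bigwig' kind string is an assumption for bigWig input.

    import pybedtools
    import metaseq

    features = pybedtools.BedTool('TFBS.gff3')
    print(features.count())          # 0 here would explain both errors

    sig = metaseq.genomic_signal('histone_mark.bw', 'bigwig')
    arr = sig.array(features, bins=100, processes=4)
    print(arr.shape)                 # expect (n_features, n_bins)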

BamSignal attributes

Hi, I am trying to plot some RNA-seq signal in BAM files over an interval. Basically, I am making a heatmap as in Example 1, only with RNA-seq reads instead of ChIP. I keep getting errors like the following. Looking at the documentation and source code, I still can't figure out what attributes a BamSignal has. How can I make this work?

sort_by=s2bulk_signal.mean(axis=1),

AttributeError: 'BamSignal' object has no attribute 'mean'
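
For what it's worth, a hedged sketch of the pattern used in Example 1: the BamSignal object is only a handle on the BAM file, and the object with a .mean() method is the NumPy array returned by its array() method, so the sort_by value has to come from that array. File and variable names here are assumptions.

    import pybedtools
    import metaseq

    # Placeholder inputs.
    windows = pybedtools.BedTool('regions.bed')
    s2bulk_signal = metaseq.genomic_signal('s2bulk.bam', 'bam')

    # Build the (features x bins) NumPy array first.
    s2bulk_array = s2bulk_signal.array(windows, bins=100, processes=4)

    # .mean() is a NumPy array method, not a BamSignal method, so compute
    # the per-row means from the array, e.g. for a heatmap's sort_by:
    sort_by = s2bulk_array.mean(axis=1)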

Example 1 problem #2

Why do I get an AttributeError when I try to create an array in example 1?

import multiprocessing
processes = multiprocessing.cpu_count()

if not os.path.exists('example.npz'):
    ip_array = ip_signal.array(
        tsses_1kb,
        bins=100,
        processes=processes)
    input_array = input_signal.array(
        tsses_1kb,
        bins=100,
        processes=processes)
    ip_array /= ip_signal.mapped_read_count() / 1e6
    input_array /= input_signal.mapped_read_count() / 1e6
    metaseq.persistence.save_features_and_arrays(
        features=tsses,
        arrays={'ip': ip_array, 'input': input_array},
        prefix='example',
        link_features=True,
        overwrite=True)
Traceback (most recent call last):
  File "", line 5, in <module>
  File "/Users/nakadalab/miniconda/lib/python2.7/site-packages/metaseq/_genomic_signal.py", line 122, in array
    chunksize=chunksize, **kwargs)
  File "/Users/nakadalab/miniconda/lib/python2.7/site-packages/metaseq/array_helpers.py", line 383, in _array_parallel
    itertools.repeat(kwargs)))
  File "/Users/nakadalab/miniconda/lib/python2.7/multiprocessing/pool.py", line 251, in map
    return self.map_async(func, iterable, chunksize).get()
  File "/Users/nakadalab/miniconda/lib/python2.7/multiprocessing/pool.py", line 567, in get
    raise self._value
AttributeError: 'NoneType' object has no attribute 'group'

Thanks!

ValueError: x and y must have same first dimension

Hi again! I am making meta-plots using scripts adapted from your Example 1 ChIP-seq analysis. The script normally executes fine and produces an average plot of 6 signal data sets in a window that is determined from a gff.db file. But when I execute the script with this particular gff.db file, I get an error at the first array.

ValueError                                Traceback (most recent call last)
/home/hart/BigData/hart/pro/Mnase/hmade/bass.py in <module>()
     63     arrays['BF1'].mean(axis=0),
     64     color='r',
---> 65     label='BF1')
     66 ax.plot(
     67     x,

My best guess is that these sites have very little or no signal associated with them. The NPZ file that is generated is the smallest one, only 1.2 MB, and the NPY files inside it are all 192.6 KB.
Do you know what is causing this behaviour? Thanks. -Keller.
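
A hedged diagnostic sketch rather than an answer: matplotlib raises that ValueError when the x vector and the averaged signal have different lengths, which can happen if the array saved for this particular gff.db is empty or has an unexpected number of bins, so printing the shapes before plotting usually pinpoints it. The file name and the x definition below are assumptions about the adapted script.

    import numpy as np

    npz = np.load('example.npz')          # the NPZ written by the script
    for name in npz.files:
        print(name, npz[name].shape)      # expect (n_features, n_bins) for each

    x = np.linspace(-1000, 1000, 100)     # whatever the script uses for x
    y = npz['BF1'].mean(axis=0)
    print(len(x), y.shape)                # these lengths must match for ax.plot(x, y)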

Example 1 problem

Hi, I am trying to follow example 1 for metaseq (https://daler.github.io/metaseq/example_session.html) and am having problems with the example data.

I'm following the script by copying from the example page (or the Jupyter example page), but it will not generate an example.features file (though the example.npz file is generated). Can you please help me?

code:
metaseq.persistence.save_features_and_arrays(
    features=tsses,
    arrays={'ip': ip_array, 'input': input_array},
    prefix='example',
    link_features=True,
    overwrite=True)

output:
ln: example.features: No such file or directory
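
One hedged thing to try while debugging (this is an assumption about what the flag does, not a confirmed fix): the ln message suggests the failure happens while symlinking the features file, and link_features is the argument that appears to control that step, so passing link_features=False (using the same variables as the snippet above) may show whether the rest of the call works.

    metaseq.persistence.save_features_and_arrays(
        features=tsses,
        arrays={'ip': ip_array, 'input': input_array},
        prefix='example',
        # Assumption: False writes the features out directly instead of
        # creating a symlink with `ln`.
        link_features=False,
        overwrite=True)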

global name 'stats' is not defined

I get the error message "global name 'stats' is not defined".

Traceback:

  /home/jraab/virtualenvs/base/local/lib/python2.7/site-packages/metaseq/plotutils.pyc in    ci(arr, conf)
    229     n = len(arr)
    230     se = arr.std(axis=0) / np.sqrt(n)
--> 231     h = se * stats.t._ppf((1 + conf) / 2., n - 1)
    232     return m, m - h, m + h
    233 

I am following along with the example in the docs:

    fig = metaseq.plotutils.imshow(
        # The array to plot
        y.values,

        # X-axis to use
        x=x,

        # Change the default figure size to something smaller for these docs
        figsize=(3, 7),

        # Make the colorbar limits go from 5th to 99th percentile.
        # `percentile=True` means treat vmin/vmax as percentiles rather than
        # actual values.
        vmin=5, vmax=99, percentile=True,

        # Style for the average line plot
        line_kwargs=dict(color='k', label='All'),
    )

I am using my own array of ChIP signal, which I can plot as an aggregate via matplotlib as normal. I tried importing scipy.stats separately, but I'm getting the same error.
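
The traceback suggests that, in this installed version, plotutils.ci refers to scipy.stats without importing it. As a stopgap, here is a standalone sketch of an equivalent helper (not metaseq's own code) with the import included, using the public stats.t.ppf:

    import numpy as np
    from scipy import stats

    def ci(arr, conf=0.95):
        """Return (mean, lower, upper) of a column-wise confidence band."""
        arr = np.asarray(arr)
        n = len(arr)
        m = arr.mean(axis=0)
        se = arr.std(axis=0) / np.sqrt(n)
        h = se * stats.t.ppf((1 + conf) / 2.0, n - 1)
        return m, m - h, m + h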

Example 1 problem #3: can't find metaseq

Hello,

I have installed metaseq with conda (conda install --channel bioconda metaseq-all) and I have downloaded Example 1.

However, when I run the cell where metaseq should be imported:

import metaseq

ip_signal = metaseq.genomic_signal(
    os.path.join(data_dir, 'wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam'),
    'bam')

input_signal = metaseq.genomic_signal(
    os.path.join(data_dir, 'wgEncodeHaibTfbsK562RxlchV0416101AlnRep1_chr17.bam'),
    'bam')

I get this error message:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-10-62b03301eba5> in <module>
----> 1 import metaseq
      2 
      3 ip_signal = metaseq.genomic_signal(
      4     os.path.join(data_dir, 'wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam'),
      5     'bam')

ModuleNotFoundError: No module named 'metaseq'

I don't know why, because metaseq is installed in the conda environment.

Does anyone know what the problem is?
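
A hedged first check rather than an answer: ModuleNotFoundError is a Python 3 exception, while metaseq historically targets Python 2.7 (see the first issue above), so the notebook kernel may simply be a different Python than the conda environment the package was installed into. Printing the interpreter path from inside the notebook makes that easy to confirm:

    import sys

    # If this does not point inside the conda environment where
    # `conda install --channel bioconda metaseq-all` was run, the kernel is
    # using a different Python than the one metaseq was installed into.
    print(sys.executable)
    print(sys.version)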

Sample snippet for ChIP signal visualization?

I want to extract the TNNI3 gene, for example, and plot the ChIP-seq signal extending from a specified upstream region to a downstream region, and then pile up several other plots, including the gene body, etc. I am struggling to migrate my code to Python for easier future maintenance.

I read the Example 1 script. It seems that I have to call another Python package (in my case, to create a feature object for the TNNI3 gene) to complete the above task, and there is no one-stop solution in the current metaseq package.

Anyway, any suggestions are appreciated. (I prefer a more Python-centric way, not using rpy2 as a bridge to call R/Bioconductor.)
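
Not an official recipe, just a rough sketch of one all-Python route under several assumptions: a gffutils database built from your annotation in which 'TNNI3' is a valid feature ID, a ChIP-seq BAM file, and hypothetical file names throughout. It pulls the gene's coordinates with gffutils, pads them, and feeds a single window to metaseq:

    import gffutils
    import pybedtools
    import numpy as np
    import matplotlib.pyplot as plt
    import metaseq

    # Assumed inputs: a gffutils database and a ChIP-seq BAM file.
    db = gffutils.FeatureDB('annotation.gff.db')
    gene = db['TNNI3']                      # assumes 'TNNI3' is the feature ID

    upstream, downstream = 2000, 2000       # padding around the gene body
    start = max(gene.start - upstream, 1)
    end = gene.end + downstream

    # One-interval BedTool covering the gene body plus flanks
    # (BED coordinates are 0-based, half-open).
    window = pybedtools.BedTool(
        '{0}\t{1}\t{2}\t{3}\t0\t{4}'.format(
            gene.seqid, start - 1, end, gene.id, gene.strand),
        from_string=True)

    sig = metaseq.genomic_signal('chipseq.bam', 'bam')
    arr = sig.array(window, bins=500, processes=4)

    x = np.linspace(start, end, arr.shape[1])
    fig, ax = plt.subplots()
    ax.plot(x, arr.mean(axis=0), color='k')
    ax.axvspan(gene.start, gene.end, color='0.9')   # shade the gene body
    ax.set_xlabel('{0} position (bp)'.format(gene.seqid))
    ax.set_ylabel('ChIP signal')
    ax.set_title(gene.id)
    plt.show()

From there, additional panels (gene models, other marks) could be stacked as extra axes in the same figure.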

Incompatible with sklearn = 0.14.1

Hello,
I used pip install metaseq and have this install log:

Downloading/unpacking metaseq
  Downloading metaseq-0.1dev.tar.gz
  Running setup.py egg_info for package metaseq
    Downloading http://pypi.python.org/packages/source/d/distribute/distribute-0.6.14.tar.gz
    Extracting in /tmp/tmpt2tYbx
    Now working in /tmp/tmpt2tYbx/distribute-0.6.14
    Building a Distribute egg in /tmp/pip-build-obot/metaseq
    /tmp/pip-build-obot/metaseq/distribute-0.6.14-py2.7.egg

Requirement already satisfied (use --upgrade to upgrade): bx-python in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): numpy in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): HTSeq in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): matplotlib in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): scipy in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from metaseq)
Downloading/unpacking scikits.learn (from metaseq)
  Downloading scikits.learn-0.8.1.tar.gz (1.6MB): 1.6MB downloaded
  Running setup.py egg_info for package scikits.learn
    Warning: Assuming default configuration (scikits/learn/svm/tests/{setup_tests,setup}.py was not found)Appending scikits.learn.svm.tests configuration to scikits.learn.svm
    Ignoring attempt to set 'name' (from 'scikits.learn.svm' to 'scikits.learn.svm.tests')
    blas_opt_info:
    blas_mkl_info:
      libraries mkl,vml,guide not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    atlas_blas_threads_info:
    Setting PTATLAS=ATLAS
      libraries ptf77blas,ptcblas,atlas not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64/sse2', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    atlas_blas_info:
      libraries f77blas,cblas,atlas not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64/sse2', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/distutils/system_info.py:1494: UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      warnings.warn(AtlasNotFoundError.__doc__)
    blas_info:
    Replacing _lib_names[0]=='blas' with 'fblas'
    Replacing _lib_names[0]=='fblas' with 'fblas'
      FOUND:
        libraries = ['fblas']
        library_dirs = ['/nas3/yeolab/Software/Python-Triton/BLAS']
        language = f77

      FOUND:
        libraries = ['fblas']
        library_dirs = ['/nas3/yeolab/Software/Python-Triton/BLAS']
        define_macros = [('NO_ATLAS_INFO', 1)]
        language = f77

    Warning: Assuming default configuration (scikits/learn/feature_extraction/{setup_feature_extraction,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_extraction/tests/setup_feature_extraction/{setup_feature_extraction/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/cluster/tests/setup_cluster/{setup_cluster/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/covariance/{setup_covariance,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/covariance/tests/setup_covariance/{setup_covariance/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/decomposition/{setup_decomposition,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/decomposition/tests/setup_decomposition/{setup_decomposition/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_selection/{setup_feature_selection,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_selection/tests/setup_feature_selection/{setup_feature_selection/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/preprocessing/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/utils/tests/setup_utils/{setup_utils/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/externals/joblib/{setup_joblib,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/externals/joblib/test/setup_joblib/{setup_joblib/test,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/gaussian_process/{setup_gaussian_process,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/gaussian_process/tests/setup_gaussian_process/{setup_gaussian_process/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/metrics/{setup_metrics,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/metrics/tests/setup_metrics/{setup_metrics/tests,setup}.py was not found)Appending scikits.learn.svm.sparse configuration to scikits.learn.svm
    Ignoring attempt to set 'name' (from 'scikits.learn.svm' to 'scikits.learn.svm.sparse')
    Appending scikits.learn.svm configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.svm')
    Appending scikits.learn.datasets configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.datasets')
    Appending scikits.learn.feature_extraction configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_extraction')
    Appending scikits.learn.feature_extraction/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_extraction/tests')
    Appending scikits.learn.cluster configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.cluster')
    Appending scikits.learn.cluster/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.cluster/tests')
    Appending scikits.learn.covariance configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.covariance')
    Appending scikits.learn.covariance/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.covariance/tests')
    Appending scikits.learn.decomposition configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.decomposition')
    Appending scikits.learn.decomposition/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.decomposition/tests')
    Appending scikits.learn.feature_selection configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_selection')
    Appending scikits.learn.feature_selection/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_selection/tests')
    Appending scikits.learn.preprocessing.tests configuration to scikits.learn.preprocessing
    Ignoring attempt to set 'name' (from 'scikits.learn.preprocessing' to 'scikits.learn.preprocessing.tests')
    Appending scikits.learn.preprocessing.sparse configuration to scikits.learn.preprocessing
    Ignoring attempt to set 'name' (from 'scikits.learn.preprocessing' to 'scikits.learn.preprocessing.sparse')
    Appending scikits.learn.preprocessing configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.preprocessing')
    Appending scikits.learn.utils.sparsetools configuration to scikits.learn.utils
    Ignoring attempt to set 'name' (from 'scikits.learn.utils' to 'scikits.learn.utils.sparsetools')
    Appending scikits.learn.utils configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils')
    Appending scikits.learn.utils/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils/tests')
    Appending scikits.learn.externals.joblib configuration to scikits.learn.externals
    Ignoring attempt to set 'name' (from 'scikits.learn.externals' to 'scikits.learn.externals.joblib')
    Appending scikits.learn.externals.joblib/test configuration to scikits.learn.externals
    Ignoring attempt to set 'name' (from 'scikits.learn.externals' to 'scikits.learn.externals.joblib/test')
    Appending scikits.learn.externals configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.externals')
    Appending scikits.learn.gaussian_process configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.gaussian_process')
    Appending scikits.learn.gaussian_process/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.gaussian_process/tests')
    Appending scikits.learn.metrics configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.metrics')
    Appending scikitscikits/learn/setup.py:39: UserWarning:
        Blas (http://www.netlib.org/blas/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [blas]) or by setting
        the BLAS environment variable.
      warnings.warn(BlasNotFoundError.__doc__)
    Warning: Assuming default configuration (scikits/learn/linear_model/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/linear_model/sparse/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/tests/{setup_tests,setup}.py was not found)s.learn.metrics/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.metrics/tests')
    Appending scikits.learn.linear_model.tests configuration to scikits.learn.linear_model
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model' to 'scikits.learn.linear_model.tests')
    Appending scikits.learn.linear_model.sparse.tests configuration to scikits.learn.linear_model.sparse
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model.sparse' to 'scikits.learn.linear_model.sparse.tests')
    Appending scikits.learn.linear_model.sparse configuration to scikits.learn.linear_model
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model' to 'scikits.learn.linear_model.sparse')
    Appending scikits.learn.linear_model configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.linear_model')
    Appending scikits.learn.utils.sparsetools configuration to scikits.learn.utils
    Ignoring attempt to set 'name' (from 'scikits.learn.utils' to 'scikits.learn.utils.sparsetools')
    Appending scikits.learn.utils configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils')
    Appending scikits.learn.tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.tests')
    Appending scikits.learn configuration to
    Ignoring attempt to set 'name' (from '' to 'scikits.learn')
    build_src
    building library "libsvm-skl" sources
    building library "cblas" sources
    building extension "scikits.learn.svm.libsvm" sources
    building extension "scikits.learn.svm.liblinear" sources
    building extension "scikits.learn.svm.sparse.libsvm" sources
    building extension "scikits.learn.cluster._inertia" sources
    building extension "scikits.learn.preprocessing.sparse._preprocessing" sources
    building extension "scikits.learn.utils.sparsetools._csgraph" sources
    building extension "scikits.learn.utils.arrayfuncs" sources
    building extension "scikits.learn.ball_tree" sources
    building extension "scikits.learn.linear_model.cd_fast" sources
    building extension "scikits.learn.linear_model.sgd_fast" sources
    building extension "scikits.learn.linear_model.sgd_fast_sparse" sources
    building extension "scikits.learn.linear_model.sparse.cd_fast_sparse" sources
    building extension "scikits.learn.utils.sparsetools._csgraph" sources
    building extension "scikits.learn.utils.arrayfuncs" sources
    building data_files sources
    build_src: building npy-pkg config files

    warning: no files found matching 'test.py'
    warning: no files found matching '*.TXT' under directory 'scikits/learn/datasets'
Requirement already satisfied (use --upgrade to upgrade): pybedtools in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/pybedtools-0.6.2-py2.7-linux-x86_64.egg (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): gffutils in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/gffutils-0.7-py2.7-linux-x86_64.egg (from metaseq)
Requirement already satisfied (use --upgrade to upgrade): cython in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages (from gffutils->metaseq)
Requirement already satisfied (use --upgrade to upgrade): argh in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/argh-0.23.2-py2.7.egg (from gffutils->metaseq)
Requirement already satisfied (use --upgrade to upgrade): argcomplete in /nas/nas0/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/argcomplete-0.5.4-py2.7.egg (from gffutils->metaseq)
/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/req.py:686: UserWarning: Module pkg_resources was already imported from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/pkg_resources.py, but /home/obot/.local/lib/python2.7/site-packages/distribute-0.6.14-py2.7.egg is being added to sys.path
  self.satisfied_by = pkg_resources.get_distribution(self.req)
/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/req.py:686: UserWarning: Module setuptools was already imported from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg/setuptools/__init__.py, but /home/obot/.local/lib/python2.7/site-packages/distribute-0.6.14-py2.7.egg is being added to sys.path
  self.satisfied_by = pkg_resources.get_distribution(self.req)
/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/pip-1.3.1-py2.7.egg/pip/req.py:686: UserWarning: Module site was already imported from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site.pyc, but /home/obot/.local/lib/python2.7/site-packages/distribute-0.6.14-py2.7.egg is being added to sys.path
  self.satisfied_by = pkg_resources.get_distribution(self.req)
Requirement already satisfied (use --upgrade to upgrade): distribute in /home/obot/.local/lib/python2.7/site-packages/distribute-0.6.14-py2.7.egg (from argcomplete->gffutils->metaseq)
Installing collected packages: metaseq, scikits.learn
  Running setup.py install for metaseq
    cythoning metaseq/rebin.pyx to metaseq/rebin.c
    building 'metaseq.rebin' extension
    gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c metaseq/rebin.c -o build/temp.linux-x86_64-2.7/metaseq/rebin.o
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from metaseq/rebin.c:314:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    metaseq/rebin.c: In function '__pyx_pf_7metaseq_5rebin_2rebin':
    metaseq/rebin.c:1946: warning: '__pyx_v_i2' may be used uninitialized in this function
    metaseq/rebin.c: In function '__pyx_pf_7metaseq_5rebin_float_rebin':
    metaseq/rebin.c:1352: warning: '__pyx_v_i2' may be used uninitialized in this function
    gcc -pthread -shared build/temp.linux-x86_64-2.7/metaseq/rebin.o -o build/lib.linux-x86_64-2.7/metaseq/rebin.so
    changing mode of build/scripts-2.7/download_metaseq_example_data.py from 664 to 775

    changing mode of /nas3/yeolab/Software/Python-2.7.5/bin/download_metaseq_example_data.py to 775
  Running setup.py install for scikits.learn
    Warning: Assuming default configuration (scikits/learn/svm/tests/{setup_tests,setup}.py was not found)Appending scikits.learn.svm.tests configuration to scikits.learn.svm
    Ignoring attempt to set 'name' (from 'scikits.learn.svm' to 'scikits.learn.svm.tests')
    blas_opt_info:
    blas_mkl_info:
      libraries mkl,vml,guide not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    atlas_blas_threads_info:
    Setting PTATLAS=ATLAS
      libraries ptf77blas,ptcblas,atlas not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64/sse2', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    atlas_blas_info:
      libraries f77blas,cblas,atlas not found in ['/nas3/yeolab/Software/Python-2.7.5/lib', '/usr/local/lib64', '/usr/local/lib', '/usr/lib64/sse2', '/usr/lib64', '/usr/lib']
      NOT AVAILABLE

    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/distutils/system_info.py:1494: UserWarning:
        Atlas (http://math-atlas.sourceforge.net/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [atlas]) or by setting
        the ATLAS environment variable.
      warnings.warn(AtlasNotFoundError.__doc__)
    blas_info:
    Replacing _lib_names[0]=='blas' with 'fblas'
    Replacing _lib_names[0]=='fblas' with 'fblas'
      FOUND:
        libraries = ['fblas']
        library_dirs = ['/nas3/yeolab/Software/Python-Triton/BLAS']
        language = f77

      FOUND:
        libraries = ['fblas']
        library_dirs = ['/nas3/yeolab/Software/Python-Triton/BLAS']
        define_macros = [('NO_ATLAS_INFO', 1)]
        language = f77

    Warning: Assuming default configuration (scikits/learn/feature_extraction/{setup_feature_extraction,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_extraction/tests/setup_feature_extraction/{setup_feature_extraction/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/cluster/tests/setup_cluster/{setup_cluster/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/covariance/{setup_covariance,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/covariance/tests/setup_covariance/{setup_covariance/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/decomposition/{setup_decomposition,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/decomposition/tests/setup_decomposition/{setup_decomposition/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_selection/{setup_feature_selection,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/feature_selection/tests/setup_feature_selection/{setup_feature_selection/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/preprocessing/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/utils/tests/setup_utils/{setup_utils/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/externals/joblib/{setup_joblib,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/externals/joblib/test/setup_joblib/{setup_joblib/test,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/gaussian_process/{setup_gaussian_process,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/gaussian_process/tests/setup_gaussian_process/{setup_gaussian_process/tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/metrics/{setup_metrics,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/metrics/tests/setup_metrics/{setup_metrics/tests,setup}.py was not found)Appending scikits.learn.svm.sparse configuration to scikits.learn.svm
    Ignoring attempt to set 'name' (from 'scikits.learn.svm' to 'scikits.learn.svm.sparse')
    Appending scikits.learn.svm configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.svm')
    Appending scikits.learn.datasets configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.datasets')
    Appending scikits.learn.feature_extraction configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_extraction')
    Appending scikits.learn.feature_extraction/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_extraction/tests')
    Appending scikits.learn.cluster configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.cluster')
    Appending scikits.learn.cluster/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.cluster/tests')
    Appending scikits.learn.covariance configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.covariance')
    Appending scikits.learn.covariance/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.covariance/tests')
    Appending scikits.learn.decomposition configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.decomposition')
    Appending scikits.learn.decomposition/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.decomposition/tests')
    Appending scikits.learn.feature_selection configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_selection')
    Appending scikits.learn.feature_selection/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.feature_selection/tests')
    Appending scikits.learn.preprocessing.tests configuration to scikits.learn.preprocessing
    Ignoring attempt to set 'name' (from 'scikits.learn.preprocessing' to 'scikits.learn.preprocessing.tests')
    Appending scikits.learn.preprocessing.sparse configuration to scikits.learn.preprocessing
    Ignoring attempt to set 'name' (from 'scikits.learn.preprocessing' to 'scikits.learn.preprocessing.sparse')
    Appending scikits.learn.preprocessing configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.preprocessing')
    Appending scikits.learn.utils.sparsetools configuration to scikits.learn.utils
    Ignoring attempt to set 'name' (from 'scikits.learn.utils' to 'scikits.learn.utils.sparsetools')
    Appending scikits.learn.utils configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils')
    Appending scikits.learn.utils/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils/tests')
    Appending scikits.learn.externals.joblib configuration to scikits.learn.externals
    Ignoring attempt to set 'name' (from 'scikits.learn.externals' to 'scikits.learn.externals.joblib')
    Appending scikits.learn.externals.joblib/test configuration to scikits.learn.externals
    Ignoring attempt to set 'name' (from 'scikits.learn.externals' to 'scikits.learn.externals.joblib/test')
    Appending scikits.learn.externals configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.externals')
    Appending scikits.learn.gaussian_process configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.gaussian_process')
    Appending scikits.learn.gaussian_process/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.gaussian_process/tests')
    Appending scikits.learn.metrics configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.metrics')
    Appending scikitscikits/learn/setup.py:39: UserWarning:
        Blas (http://www.netlib.org/blas/) libraries not found.
        Directories to search for the libraries can be specified in the
        numpy/distutils/site.cfg file (section [blas]) or by setting
        the BLAS environment variable.
      warnings.warn(BlasNotFoundError.__doc__)
    Warning: Assuming default configuration (scikits/learn/linear_model/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/linear_model/sparse/tests/{setup_tests,setup}.py was not found)Warning: Assuming default configuration (scikits/learn/tests/{setup_tests,setup}.py was not found)s.learn.metrics/tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.metrics/tests')
    Appending scikits.learn.linear_model.tests configuration to scikits.learn.linear_model
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model' to 'scikits.learn.linear_model.tests')
    Appending scikits.learn.linear_model.sparse.tests configuration to scikits.learn.linear_model.sparse
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model.sparse' to 'scikits.learn.linear_model.sparse.tests')
    Appending scikits.learn.linear_model.sparse configuration to scikits.learn.linear_model
    Ignoring attempt to set 'name' (from 'scikits.learn.linear_model' to 'scikits.learn.linear_model.sparse')
    Appending scikits.learn.linear_model configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.linear_model')
    Appending scikits.learn.utils.sparsetools configuration to scikits.learn.utils
    Ignoring attempt to set 'name' (from 'scikits.learn.utils' to 'scikits.learn.utils.sparsetools')
    Appending scikits.learn.utils configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.utils')
    Appending scikits.learn.tests configuration to scikits.learn
    Ignoring attempt to set 'name' (from 'scikits.learn' to 'scikits.learn.tests')
    Appending scikits.learn configuration to
    Ignoring attempt to set 'name' (from '' to 'scikits.learn')
    unifing config_cc, config, build_clib, build_ext, build commands --compiler options
    unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options
    build_src
    building library "libsvm-skl" sources
    building library "cblas" sources
    building extension "scikits.learn.svm.libsvm" sources
    building extension "scikits.learn.svm.liblinear" sources
    building extension "scikits.learn.svm.sparse.libsvm" sources
    building extension "scikits.learn.cluster._inertia" sources
    building extension "scikits.learn.preprocessing.sparse._preprocessing" sources
    building extension "scikits.learn.utils.sparsetools._csgraph" sources
    building extension "scikits.learn.utils.arrayfuncs" sources
    building extension "scikits.learn.ball_tree" sources
    building extension "scikits.learn.linear_model.cd_fast" sources
    building extension "scikits.learn.linear_model.sgd_fast" sources
    building extension "scikits.learn.linear_model.sgd_fast_sparse" sources
    building extension "scikits.learn.linear_model.sparse.cd_fast_sparse" sources
    building extension "scikits.learn.utils.sparsetools._csgraph" sources
    building extension "scikits.learn.utils.arrayfuncs" sources
    building data_files sources
    build_src: building npy-pkg config files
    customize UnixCCompiler
    customize UnixCCompiler using build_clib
    building 'libsvm-skl' library
    compiling C++ sources
    C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -c'
    g++: scikits/learn/svm/src/libsvm/libsvm_template.cpp
    ar: adding 1 object files to build/temp.linux-x86_64-2.7/liblibsvm-skl.a
    building 'cblas' library
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -c'
    gcc: scikits/learn/src/cblas/ATL_sreftrsvUNU.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvLNN.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvLTN.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvLNN.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvLTU.c
    gcc: scikits/learn/src/cblas/cblas_dnrm2.c
    gcc: scikits/learn/src/cblas/ATL_srefrotg.c
    gcc: scikits/learn/src/cblas/cblas_xerbla.c
    scikits/learn/src/cblas/cblas_xerbla.c: In function 'cblas_xerbla':
    scikits/learn/src/cblas/cblas_xerbla.c:52: warning: implicit declaration of function 'exit'
    scikits/learn/src/cblas/cblas_xerbla.c:52: warning: incompatible implicit declaration of built-in function 'exit'
    gcc: scikits/learn/src/cblas/ATL_dreftrsvLTN.c
    gcc: scikits/learn/src/cblas/cblas_drotg.c
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_drefcopy.c
    gcc: scikits/learn/src/cblas/cblas_srotg.c
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_sreftrsvLNU.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvLTU.c
    gcc: scikits/learn/src/cblas/cblas_strsv.c
    scikits/learn/src/cblas/cblas_strsv.c: In function 'cblas_strsv':
    scikits/learn/src/cblas/cblas_strsv.c:73: warning: implicit declaration of function 'cblas_xerbla'
    scikits/learn/src/cblas/cblas_strsv.c: At top level:
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_srefcopy.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvUNU.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvUTU.c
    gcc: scikits/learn/src/cblas/cblas_drot.c
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_dreftrsvUNN.c
    gcc: scikits/learn/src/cblas/cblas_srot.c
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_sreftrsv.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvUTN.c
    gcc: scikits/learn/src/cblas/cblas_dtrsv.c
    scikits/learn/src/cblas/cblas_dtrsv.c: In function 'cblas_dtrsv':
    scikits/learn/src/cblas/cblas_dtrsv.c:73: warning: implicit declaration of function 'cblas_xerbla'
    scikits/learn/src/cblas/cblas_dtrsv.c: At top level:
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_drefrotg.c
    gcc: scikits/learn/src/cblas/ATL_srefrot.c
    gcc: scikits/learn/src/cblas/cblas_errprn.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvLNU.c
    gcc: scikits/learn/src/cblas/ATL_sreftrsvUNN.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvUTN.c
    gcc: scikits/learn/src/cblas/cblas_dscal.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsv.c
    gcc: scikits/learn/src/cblas/ATL_dreftrsvUTU.c
    gcc: scikits/learn/src/cblas/cblas_scopy.c
    scikits/learn/src/cblas/cblas_scopy.c: In function 'cblas_scopy':
    scikits/learn/src/cblas/cblas_scopy.c:46: warning: implicit declaration of function 'ATL_srefcopy'
    scikits/learn/src/cblas/cblas_scopy.c: At top level:
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/cblas_ddot.c
    gcc: scikits/learn/src/cblas/cblas_daxpy.c
    gcc: scikits/learn/src/cblas/cblas_dcopy.c
    scikits/learn/src/cblas/cblas_dcopy.c: In function 'cblas_dcopy':
    scikits/learn/src/cblas/cblas_dcopy.c:46: warning: implicit declaration of function 'ATL_drefcopy'
    scikits/learn/src/cblas/cblas_dcopy.c: At top level:
    scikits/learn/src/cblas/atlas_misc.h:398: warning: 'ATL_AlignOffset' defined but not used
    scikits/learn/src/cblas/atlas_misc.h:415: warning: 'ATL_Align2Ptr' defined but not used
    gcc: scikits/learn/src/cblas/ATL_drefrot.c
    ar: adding 38 object files to build/temp.linux-x86_64-2.7/libcblas.a
    customize UnixCCompiler
    customize UnixCCompiler using build_ext
    resetting extension 'scikits.learn.svm.liblinear' language from 'f77' to 'c++'.
    customize UnixCCompiler
    customize UnixCCompiler using build_ext
    customize Gnu95FCompiler
    Found executable /usr/bin/gfortran
    customize Gnu95FCompiler
    customize Gnu95FCompiler using build_ext
    building 'scikits.learn.svm.libsvm' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -Iscikits/learn/svm/src/libsvm -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/svm/libsvm.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/svm/libsvm.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    scikits/learn/svm/libsvm.c: In function '__pyx_pf_7scikits_5learn_3svm_6libsvm_fit':
    scikits/learn/svm/libsvm.c:1524: warning: assignment discards qualifiers from pointer target type
    scikits/learn/svm/libsvm.c: In function '__pyx_pf_7scikits_5learn_3svm_6libsvm_cross_validation':
    scikits/learn/svm/libsvm.c:5769: warning: assignment discards qualifiers from pointer target type
    scikits/learn/svm/libsvm.c: At top level:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/svm/libsvm.c:9410: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/svm/libsvm.c:9420: warning: '__Pyx_EndUnpack' defined but not used
    g++ -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/svm/libsvm.o -Lbuild/temp.linux-x86_64-2.7 -llibsvm-skl -o build/lib.linux-x86_64-2.7/scikits/learn/svm/libsvm.so
    building 'scikits.learn.svm.liblinear' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-DNO_ATLAS_INFO=1 -Iscikits/learn/svm/src -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/svm/liblinear.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/svm/liblinear.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    scikits/learn/svm/liblinear.c: In function '__pyx_pf_7scikits_5learn_3svm_9liblinear_train_wrap':
    scikits/learn/svm/liblinear.c:1034: warning: assignment discards qualifiers from pointer target type
    scikits/learn/svm/liblinear.c: In function '__pyx_pf_7scikits_5learn_3svm_9liblinear_csr_train_wrap':
    scikits/learn/svm/liblinear.c:1720: warning: assignment discards qualifiers from pointer target type
    scikits/learn/svm/liblinear.c: At top level:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/svm/liblinear.c:7600: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/svm/liblinear.c:7610: warning: '__Pyx_EndUnpack' defined but not used
    compiling C++ sources
    C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC

    compile options: '-DNO_ATLAS_INFO=1 -Iscikits/learn/svm/src -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    g++: scikits/learn/svm/src/liblinear/tron.cpp
    g++: scikits/learn/svm/src/liblinear/linear.cpp
    scikits/learn/svm/src/liblinear/linear.cpp: In function 'void train_one(const problem*, const parameter*, double*, double, double)':
    scikits/learn/svm/src/liblinear/linear.cpp:1099: warning: 'loss_old' may be used uninitialized in this function
    scikits/learn/svm/src/liblinear/linear.cpp:1097: warning: 'Gmax_init' may be used uninitialized in this function
    scikits/learn/svm/src/liblinear/linear.cpp:1379: warning: 'Gmax_init' may be used uninitialized in this function
    g++ -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/svm/liblinear.o build/temp.linux-x86_64-2.7/scikits/learn/svm/src/liblinear/tron.o build/temp.linux-x86_64-2.7/scikits/learn/svm/src/liblinear/linear.o -L/nas3/yeolab/Software/Python-Triton/BLAS -Lbuild/temp.linux-x86_64-2.7 -lfblas -o build/lib.linux-x86_64-2.7/scikits/learn/svm/liblinear.so
    building 'scikits.learn.svm.sparse.libsvm' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -Iscikits/learn/svm/src/libsvm -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/svm/sparse/libsvm.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/svm/sparse/libsvm.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    scikits/learn/svm/sparse/libsvm.c: In function '__pyx_pf_7scikits_5learn_3svm_6sparse_6libsvm_libsvm_sparse_train':
    scikits/learn/svm/sparse/libsvm.c:1510: warning: assignment discards qualifiers from pointer target type
    scikits/learn/svm/sparse/libsvm.c: At top level:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/svm/sparse/libsvm.c:5821: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/svm/sparse/libsvm.c:5831: warning: '__Pyx_EndUnpack' defined but not used
    g++ -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/svm/sparse/libsvm.o -Lbuild/temp.linux-x86_64-2.7 -llibsvm-skl -o build/lib.linux-x86_64-2.7/scikits/learn/svm/sparse/libsvm.so
    building 'scikits.learn.cluster._inertia' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/cluster/_inertia.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/cluster/_inertia.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/cluster/_inertia.c:3796: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/cluster/_inertia.c:3806: warning: '__Pyx_EndUnpack' defined but not used
    gcc -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/cluster/_inertia.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/cluster/_inertia.so
    building 'scikits.learn.preprocessing.sparse._preprocessing' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/preprocessing/sparse/src/_preprocessing.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/preprocessing/sparse/src/_preprocessing.c:225:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    gcc -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/preprocessing/sparse/src/_preprocessing.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/preprocessing/sparse/_preprocessing.so
    building 'scikits.learn.utils.sparsetools._csgraph' extension
    compiling C++ sources
    C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC

    compile options: '-D__STDC_FORMAT_MACROS=1 -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    g++: scikits/learn/utils/sparsetools/csgraph_wrap.cxx
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from scikits/learn/utils/sparsetools/npy_3kcompat.h:23,
                     from scikits/learn/utils/sparsetools/py3k.h:23,
                     from scikits/learn/utils/sparsetools/csgraph_wrap.cxx:2821:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    scikits/learn/utils/sparsetools/npy_3kcompat.h:391: warning: 'void simple_capsule_dtor(void*)' defined but not used
    g++ -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/utils/sparsetools/csgraph_wrap.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/utils/sparsetools/_csgraph.so
    building 'scikits.learn.utils.arrayfuncs' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-DNO_ATLAS_INFO=1 -Iscikits/learn/src/cblas -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/utils/arrayfuncs.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/utils/arrayfuncs.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/utils/arrayfuncs.c:3728: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/utils/arrayfuncs.c:3738: warning: '__Pyx_EndUnpack' defined but not used
    scikits/learn/utils/arrayfuncs.c:1441: warning: '__pyx_pf_5numpy_7ndarray___getbuffer__' defined but not used
    scikits/learn/utils/arrayfuncs.c:2246: warning: '__pyx_pf_5numpy_7ndarray___releasebuffer__' defined but not used
    /usr/bin/gfortran -Wall -Wall -shared build/temp.linux-x86_64-2.7/scikits/learn/utils/arrayfuncs.o -L/nas3/yeolab/Software/Python-Triton/BLAS -Lbuild/temp.linux-x86_64-2.7 -lcblas -lgfortran -o build/lib.linux-x86_64-2.7/scikits/learn/utils/arrayfuncs.so
    building 'scikits.learn.ball_tree' extension
    compiling C++ sources
    C compiler: g++ -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    g++: scikits/learn/src/ball_tree.cpp
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/src/ball_tree.cpp:226:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1594: warning: 'int _import_array()' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:236: warning: 'int _import_umath()' defined but not used
    g++ -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/src/ball_tree.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/ball_tree.so
    building 'scikits.learn.linear_model.cd_fast' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-DNO_ATLAS_INFO=1 -Iscikits/learn/src/cblas -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/linear_model/cd_fast.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/linear_model/cd_fast.c:201:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    /usr/bin/gfortran -Wall -Wall -shared build/temp.linux-x86_64-2.7/scikits/learn/linear_model/cd_fast.o -L/nas3/yeolab/Software/Python-Triton/BLAS -Lbuild/temp.linux-x86_64-2.7 -lcblas -lgfortran -o build/lib.linux-x86_64-2.7/scikits/learn/linear_model/cd_fast.so
    building 'scikits.learn.linear_model.sgd_fast' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/linear_model/sgd_fast.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/linear_model/sgd_fast.c:225:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/linear_model/sgd_fast.c: In function '__pyx_pf_7scikits_5learn_12linear_model_8sgd_fast_plain_sgd':
    scikits/learn/linear_model/sgd_fast.c:4342: warning: '__pyx_v_q_data_ptr' may be used uninitialized in this function
    gcc -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/linear_model/sgd_fast.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/linear_model/sgd_fast.so
    building 'scikits.learn.linear_model.sgd_fast_sparse' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/linear_model/sgd_fast_sparse.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/linear_model/sgd_fast_sparse.c:225:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/linear_model/sgd_fast_sparse.c: In function '__pyx_pf_7scikits_5learn_12linear_model_15sgd_fast_sparse_plain_sgd':
    scikits/learn/linear_model/sgd_fast_sparse.c:1238: warning: '__pyx_v_q_data_ptr' may be used uninitialized in this function
    gcc -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/linear_model/sgd_fast_sparse.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/linear_model/sgd_fast_sparse.so
    building 'scikits.learn.linear_model.sparse.cd_fast_sparse' extension
    compiling C sources
    C compiler: gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC

    compile options: '-I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include -I/nas3/yeolab/Software/Python-2.7.5/include/python2.7 -c'
    gcc: scikits/learn/linear_model/sparse/src/cd_fast_sparse.c
    In file included from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarraytypes.h:1728,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/ndarrayobject.h:17,
                     from /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/arrayobject.h:15,
                     from scikits/learn/linear_model/sparse/src/cd_fast_sparse.c:158:
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/npy_deprecated_api.h:11:2: warning: #warning "Using deprecated NumPy API, disable it by #defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION"
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__multiarray_api.h:1595: warning: '_import_array' defined but not used
    /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/numpy/core/include/numpy/__ufunc_api.h:237: warning: '_import_umath' defined but not used
    scikits/learn/linear_model/sparse/src/cd_fast_sparse.c:5077: warning: '__Pyx_UnpackItem' defined but not used
    scikits/learn/linear_model/sparse/src/cd_fast_sparse.c:5087: warning: '__Pyx_EndUnpack' defined but not used
    gcc -pthread -shared build/temp.linux-x86_64-2.7/scikits/learn/linear_model/sparse/src/cd_fast_sparse.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/scikits/learn/linear_model/sparse/cd_fast_sparse.so

    warning: no files found matching 'test.py'
    warning: no files found matching '*.TXT' under directory 'scikits/learn/datasets'
    Installing /nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/scikits.learn-0.8.1-py2.7-nspkg.pth
Successfully installed metaseq scikits.learn
Cleaning up...

But when I tried to import it, I got:

import metaseq

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-5-3365faa9d779> in <module>()
----> 1 import metaseq

/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/metaseq/__init__.py in <module>()
      5         gfffeature_to_interval
      6 from genomic_signal import genomic_signal
----> 7 import plotutils
      8 import integration

/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/metaseq/plotutils.py in <module>()
      7 import numpy as np
      8 from scipy import stats
----> 9 from scikits.statsmodels.sandbox.stats.multicomp import fdrcorrection0
     10 
     11 

ImportError: No module named sandbox.stats.multicomp

/nas3/yeolab/Software/Python-2.7.5/lib/python2.7/site-packages/scikits/statsmodels/__init__.py:2: UserWarning: scikits.statsmodels namespace is deprecated and will be removed in 0.5, please use statsmodels instead
  warnings.warn('scikits.statsmodels namespace is deprecated and will be '

For the record, I have sklearn 0.14.1:

In [11]:  sklearn.__version__
Out[11]: '0.14.1'
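
For context, this particular ImportError comes from the old scikits.statsmodels namespace having been renamed to statsmodels (the deprecation warning above hints at this). A hedged workaround, not an official fix, is to make the import in metaseq/plotutils.py version-tolerant; the exact location of the FDR-correction function differs between statsmodels releases, so both branches below are assumptions to verify against your installed version:

    # Hedged sketch of a version-tolerant import for metaseq/plotutils.py.
    try:
        # Newer statsmodels releases
        from statsmodels.stats.multitest import fdrcorrection as fdrcorrection0
    except ImportError:
        # Older releases kept the function in the sandbox location
        from statsmodels.sandbox.stats.multicomp import fdrcorrection0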

Too many files open

I'm getting this error on the latest version (0.5.2) when trying to run:

tsses = pybedtools.BedTool('/ucsc/Mus_musculus/UCSC/mm9/Annotation/Genes/genes.gtf')
input = metaseq.genomic_signal('input.bed', 'bed')
data = input.array(tsses, bins=100, processes=12)

Things to check:

Too many files open -- please submit a bug report so that this can be fixed
<type 'exceptions.OSError'>: Too many open files
The command was:

bedtools sort -i long_path_here.bed

$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 257234
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 257234
virtual memory (kbytes, -v) 29360128
file locks (-x) unlimited
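
For what it's worth, hitting the default limit of 1024 open files is common when many worker processes each open temporary BED files. Two hedged mitigations: lower processes=, or raise the soft open-file limit from within the session before calling array(). A minimal sketch using the standard-library resource module (plus pybedtools.cleanup() to clear accumulated temp files):

    import resource
    import pybedtools

    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # Raise the soft limit as far as the hard limit allows (no root needed).
    new_soft = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))

    pybedtools.cleanup()  # remove temp files left over from earlier calls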

Issue with Example 1

I am working through the IPython Notebook and am getting an error related to pybedtools. I checked, and tsses.gtf is indeed generated by the previous step.

tsses_1kb = tsses.slop(b=1000, genome='hg19', output='tsses-1kb.gtf')

---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
<ipython-input-32-7282d1e13f96> in <module>()
----> 1 tsses_1kb = tsses.slop(b=1000, genome='hg19', output='tsses-1kb.gtf')

/usr/local/lib/python2.7/dist-packages/pybedtools/bedtool.pyc in decorated(self, *args, **kwargs)
    771             # this calls the actual method in the first place; *result* is
    772             # whatever you get back
--> 773             result = method(self, *args, **kwargs)
    774 
    775             # add appropriate tags

/usr/local/lib/python2.7/dist-packages/pybedtools/bedtool.pyc in not_implemented_func(*args, **kwargs)
    201         if not_implemented:
    202             def not_implemented_func(*args, **kwargs):
--> 203                 raise NotImplementedError(help_str)
    204             return not_implemented_func
    205 

NotImplementedError: "slopBed" does not appear to be installed or on the path, so this method is disabled.  Please install a more recent version of BEDTools and re-import to use this method.

I tried pip install --upgrade pybedtools, but I already have the latest version installed. I also installed bedtools 2.17.0-1, but I think the above error is asking me to update pybedtools (?).
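
A hedged note on this error: the NotImplementedError is raised when pybedtools could not find the BEDTools binaries on the PATH at import time, so upgrading pybedtools will not help by itself. A quick check, with the install path below being a hypothetical example:

    from distutils.spawn import find_executable
    import pybedtools

    print(find_executable('bedtools'))   # None means BEDTools is not on PATH
    # If BEDTools lives in a non-standard location, point pybedtools at it
    # (hypothetical path) and then re-import pybedtools in a fresh session.
    pybedtools.set_bedtools_path('/opt/bedtools2/bin')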

MetaSeq on non-traditional files that have already been binned?

I have written a new fast medium/diffuse area chip-seq caller in Python. It uses a binning strategy and one of the files it produces contains the island counts:

Chromosome Bin Chip1 Chip2 Input1 Input2
chr1 11550 3 6 1 2
chr1 11600 4 5 0 2
...

I'd love to make it easy to use metaseq with my ChIP-seq caller to produce various graphs. With the data above I could regenerate four bed files* so that using metaseq would be straightforward, but that would be needlessly time-consuming. Therefore I'll look into creating an adapter for data like the above.

What I want is a file adapter that iterates over the columns of the dataframe and returns one signal object for each.

What do the genomic_signal objects look like? I imagine they are pretty similar to my table above...
I could not find them in the test suite: https://github.com/daler/metaseq/blob/master/metaseq/test/test.py

I guess I could play around with metaseq for half a day and find out, but I am not used to reading OO code, so it would be hard for me. I would love some comments on the strategy for / feasibility of creating genomic_signal objects for data like the above, if you have the time. (See the sketch after the footnote below.)

*For sample 1 I'd make a bed with the following:

chr1 11550 11600
chr1 11550 11600
chr1 11550 11600
chr1 11550 11600
chr1 11600 11650
...

cannot import name 'data_dir' from 'helpers'

Hi,
I tried to import metaseq and got the error "No module named 'helpers'". I then ran pip install helpers and imported metaseq again. Then I got the error "ImportError: cannot import name 'data_dir' from 'helpers'".

Thanks!

Syntax help

I am trying to work through Example 1 using the provided metaseq environment. Any idea why I am getting an error? I tried adding spaces before and after the offending '=' but the ^ still points to the x in prefix.

if not os.path.exists('example.npz'):
... ip_array = ip_signal.array(
... tsses_1kb,
... bins=100,
... processes=processes)
... input_array = input_signal.array(
... tsses_1kb,
... bins=100,
... processes=processes)
... ip_array /= ip_signal.mapped_read_count() / 1e6
... input_array /= input_signal.mapped_read_count() / 1e6
... metaseq.persistence.save_features_and_arrays(
... features=tsses,
... arrays={'ip': ip_array, 'input': input_array}
... prefix='example',
File "", line 15
prefix='example',
^
SyntaxError: invalid syntax
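
If it helps, the SyntaxError is most likely just a missing comma after the arrays={...} argument, which makes Python choke on the following prefix= line. A corrected version of that call (only the comma changes) would look like:

    metaseq.persistence.save_features_and_arrays(
        features=tsses,
        arrays={'ip': ip_array, 'input': input_array},   # <-- comma was missing here
        prefix='example')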

Splitting arrays

I have been enjoying using your program, thank you for all the great work and support.

I have made some heat maps of MNase reads centred around the binding site of my favourite TF. I then sorted the sites based on density of reads. I want to investigate the "enriched" sites. Is there a way to split the array (in quartiles perhaps) and group the sorted sites into new gff files? Any advice would be greatly appreciated! -Keller
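
Not a built-in metaseq feature as far as I know, but as a hedged sketch of one way to do this with numpy and pybedtools: sort the rows by mean signal, split the row indices into quartiles, and write the corresponding features to separate GFF files. It assumes arr is the features-by-bins array and that the GFF (hypothetical filename below) is in the same row order used to build it.

    import numpy as np
    import pybedtools

    features = list(pybedtools.BedTool('sites.gff'))   # hypothetical filename
    # arr is assumed to come from something like genomic_signal(...).array(features)
    order = np.argsort(arr.mean(axis=1))[::-1]         # densest sites first
    quartiles = np.array_split(order, 4)

    for i, idx in enumerate(quartiles):
        subset = [features[j] for j in idx]
        pybedtools.BedTool(subset).saveas('sites_quartile%d.gff' % (i + 1))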

Segmentation fault

I tried installing metaseq with anaconda (I have python 2). I then tried importing it with python 2 and got a segmentation fault. I have re-done this using gcc versions 4.8, 4.9, and 5.4 and gotten a segmentation fault every time. Is there a way that I can fix this so that I can use metaseq? Thanks so much!

use of bedGraphs in genomic_signal

In the original 2014 publication, it says that we can obtain aggregated read density over features of interest using bedGraphs. For strand-specific data (e.g. nascent transcription) it would be nice to be able to use bedGraphs in the genomic_signal function, since it is easier to concatenate pos/neg strands in them (as opposed to bigWigs, where you're forced to separate by strand due to the restriction on overlapping coordinates).

Is there a way to do this currently that I've missed? If not, I guess this would be a feature request.

Thanks!

A problem when running the script

Hi daler, I recently used metaseq to draw a TSS plot, but I am stuck at the ip_signal.array() and input_signal.array() calls. When I run the script it comes up with the error below. I use Canopy and my Python version is 2.7.9. Would you help me with this problem? Thanks.

TypeError Traceback (most recent call last)
in ()
14
15 # Use multiple CPUs. Dramatically speeds up run time.
---> 16 processes=processes)
17
18 # Do the same thing for input.

/home/zluna/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/metaseq-0.5.5.4-py2.7.egg/metaseq/_genomic_signal.pyc in array(self, features, processes, chunksize, ragged, **kwargs)
120 arrays = _array_parallel(
121 self.fn, self.__class__, features, processes=processes,
--> 122 chunksize=chunksize, **kwargs)
123 else:
124 arrays = _array(self.fn, self.__class__, features, **kwargs)

/home/zluna/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/metaseq-0.5.5.4-py2.7.egg/metaseq/array_helpers.pyc in _array_parallel(fn, cls, genelist, chunksize, processes, **kwargs)
381 itertools.repeat(cls),
382 chunks,
--> 383 itertools.repeat(kwargs)))
384 pool.close()
385 pool.join()

/home/zluna/Canopy/appdata/canopy-1.5.5.3123.rh5-x86_64/lib/python2.7/multiprocessing/pool.pyc in map(self, func, iterable, chunksize)
249 '''
250 assert self._state == RUN
--> 251 return self.map_async(func, iterable, chunksize).get()
252
253 def imap(self, func, iterable, chunksize=1):

/home/zluna/Canopy/appdata/canopy-1.5.5.3123.rh5-x86_64/lib/python2.7/multiprocessing/pool.pyc in get(self, timeout)
556 return self._value
557 else:
--> 558 raise self._value
559
560 def _set(self, i, obj):

TypeError: Expected bytes, got unicode
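
A hedged observation: this "Expected bytes, got unicode" TypeError usually comes from the underlying Cython/pysam layer receiving a unicode string under Python 2 (for example, a filename read from a config file or affected by unicode_literals). Coercing the paths to plain byte strings before handing them to metaseq and pybedtools often avoids it; the paths below are placeholders:

    import metaseq
    import pybedtools

    bam_path = u'/path/to/ip.bam'                      # hypothetical unicode path
    ip_signal = metaseq.genomic_signal(str(bam_path), 'bam')
    tsses_1kb = pybedtools.BedTool(str(u'tsses-1kb.gtf'))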

How is missing data interpreted?

How is missing signal data in a bigWig file treated? Are these intervals treated as zeros or ignored? Is there an argument in the array() method to control this?

multiprocessing

Hi again! I was hoping you might be able to trace the root issue of my problem. I am trying to plot a heatmap using a gff file for signal (this file is large: 9.1 MB). I tried running it on my local machine but kept getting errors like this:
NotifierThreadProc: could not create trigger pipe
NotifierThreadProc: could not create trigger pipe
NotifierThreadProc: could not create trigger pipe
NotifierThreadProc: could not create trigger pipe
NotifierThreadProc: could not create trigger pipe
Traceback (most recent call last):
File "heatmo.py", line 18, in
Bkornabf_signal = Bkornabf_signal.array(sites_1kb, bins=100, processes=3)
File "/usr/local/lib/python2.7/dist-packages/metaseq/_genomic_signal.py", line 122, in array
chunksize=chunksize, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/metaseq/array_helpers.py", line 383, in _array_parallel
itertools.repeat(kwargs)))
File "/usr/lib/python2.7/multiprocessing/pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "/usr/lib/python2.7/multiprocessing/pool.py", line 558, in get
raise self._value
KeyError: "unknown preset 'empty', valid presets are 'vcf,sam,bed,psltbl,pileup,gff'"

After inspecting the genomic_signal script and converting my gff to bed, I kept getting similar errors. So I moved my data to a bigger machine with 16 processors. This seemed to get past the "trigger pipe" errors, but I get the following message:

/home/keller/anaconda2/lib/python2.7/site-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')
[E::bgzf_flush] hwrite error (wrong size)
Traceback (most recent call last):
File "heatmo.py", line 20, in
Bkornabf_signal = Bkornabf_signal.array(sites_1kb, bins=100, processes=processes)
File "/home/keller/anaconda2/lib/python2.7/site-packages/metaseq/_genomic_signal.py", line 122, in array
chunksize=chunksize, **kwargs)
File "/home/keller/anaconda2/lib/python2.7/site-packages/metaseq/array_helpers.py", line 383, in _array_parallel
itertools.repeat(kwargs)))
File "/home/keller/anaconda2/lib/python2.7/multiprocessing/pool.py", line 251, in map
return self.map_async(func, iterable, chunksize).get()
File "/home/keller/anaconda2/lib/python2.7/multiprocessing/pool.py", line 567, in get
raise self._value
OSError: writing failed

SignalMiniBrowser is not defined

Thanks for your reply to the db issue from GTF files for mm9.
Actually, I finally want to make TSS plots and heatmaps from my BAM files,
so I created the db object.

But for basic plotting checks:
All my files (BAM and db) are in the working directory "Downloads".

cd Downloads
python

import metaseq
from matplotlib import pyplot as plt
from pybedtools.contrib.plotting import Track
import pybedtools
G = gffutils.FeatureDB(
... 'Mus_musculus.NCBIM37.64.gtf.db')
ip = metaseq.genomic_signal('galaxy226.bam')
inp = metaseq.genomic_signal('galaxy58.bam')
plotting_kwargs = [
... dict(color='r', label='IP'),
... dict(color='k',linestyle=':', label='input')]
local_coverage_kwargs = dict(fragment_size=200)
b = SignalMiniBrowser([ip, inp],
... plotting_kwargs=plotting_kwargs,
... local_coverage_kwargs=local_coverage_kwargs)
Traceback (most recent call last):
File "", line 1, in
NameError: name 'SignalMiniBrowser' is not defined

Help with the commands would be very helpful. (The minibrowser.py file is present in the metaseq folder.)

suman
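
The NameError simply means the class was never imported into the session (gffutils is also used above without an import). Assuming the class lives in the minibrowser module mentioned at the end of the post, adding these imports should make the snippet self-contained:

    import gffutils
    import metaseq
    # Assumed location, based on minibrowser.py living inside the metaseq package.
    from metaseq.minibrowser import SignalMiniBrowser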

py3: relative imports fail

(Conda gives a conflicting requirements error, so I installed metaseq with pip.)

When trying to import metaseq, I get the following error:

In [2]: import metaseq
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-2-3365faa9d779> in <module>()
----> 1 import metaseq

/local/home/endrebak/anaconda3/lib/python3.5/site-packages/metaseq/__init__.py in <module>()
      2 import sys
      3 import time
----> 4 import helpers
      5 from helpers import data_dir, example_filename
      6 from _genomic_signal import genomic_signal

ImportError: No module named 'helpers'

missing dependencies

Somewhere in the dependency chain, pysam and statsmodels are missing, and metaseq fails on import.

clustering heatmap and identifying genes

Hi Ryan,

Well, I have been using metaseq for a week now. I have mainly called peaks on some ChIP-seq datasets. I convert my BED files into BAM and then make heatmaps for +/- 5 kb around TSSs.
I have an array of gene IDs and I wish to know whether these genes cluster together in the heatmap drawn using metaseq.

PS: Amazing tool, and I hope it gets even better.

subplots using metaseq.plotutils.imshow

More of a question than an issue. How would you place two sub-plots side by side, for example for this plot from the tutorial:

fig = metaseq.plotutils.imshow(
    
    # These are the same arguments as above.
    normalized_subtracted,
    x=x,
    figsize=(3, 7),
    vmin=5, vmax=99,  percentile=True,
    line_kwargs=dict(color='k', label='All'),
    fill_kwargs=dict(color='k', alpha=0.3),
    
    # This is new: sort by mean signal
    sort_by=normalized_subtracted.mean(axis=1)
)

I am having a lot of difficulty assembling the two plots produced by this function into subplots side by side. Any suggestions would be appreciated.
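
One hedged workaround, since metaseq.plotutils.imshow builds its own figure: for the side-by-side layout, draw the sorted arrays directly with matplotlib instead. This drops the attached average-signal panel but shows the general approach. normalized_subtracted and x come from the tutorial; normalized_other is a placeholder name for the second array.

    import numpy as np
    import matplotlib.pyplot as plt

    fig, axes = plt.subplots(1, 2, figsize=(6, 7), sharey=True)
    for ax, arr, title in zip(axes,
                              [normalized_subtracted, normalized_other],
                              ['IP - input', 'other signal']):
        order = np.argsort(arr.mean(axis=1))[::-1]    # sort rows by mean signal
        vmin, vmax = np.percentile(arr, [5, 99])      # same percentile idea as the tutorial
        ax.imshow(arr[order], aspect='auto', vmin=vmin, vmax=vmax,
                  extent=(x.min(), x.max(), 0, arr.shape[0]))
        ax.set_title(title)
    fig.tight_layout()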

hg19 Random Chromosome Issues

Hi,

I'm wondering if there's a way to deal with the random and alternate-haplotype chromosomes that hg19 includes, e.g. "chr9_gl000199_random" and "chr6_cox_hap2". I'm having issues in that metaseq can't handle those reads when they are in the TSS data set or in the peaks data set I am trying to use, and it returns error messages until I remove them. Is there a better option than simply removing this data from the ranges that metaseq is handling, as they are obligatory members of the hg19 build?

Thanks!
John
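
There is no metaseq-specific switch for this that I know of; a common hedged workaround is to drop the _random/_hap contigs from the feature set with pybedtools before building the arrays (the feature filename below is hypothetical):

    import pybedtools

    tsses = pybedtools.BedTool('tsses.gtf')            # hypothetical feature file
    main_chroms = set('chr%s' % c for c in list(range(1, 23)) + ['X', 'Y', 'M'])
    tsses_clean = tsses.filter(lambda f: f.chrom in main_chroms).saveas('tsses-clean.gtf')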

Tutorial ambiguity

I am relatively new to Python and am having trouble following the tutorial provided. In bash I load the (metaseq-test) environment and download the example data to ~/metaseq-example/data. At this point the tutorial confuses me. I can set the variable data_dir = 'metaseq-example/data' if I switch to Python, but then commands like "ls" and "head" don't work. If I switch to bash, it doesn't know the Python variables, and you cannot set them in bash the way you show in the tutorial.

Sorry to complain about what I am sure is an elementary issue, but I feel the tutorial could be a bit more clear in this regard. Thank you! -Keller

install issues

I was attempting to check this out and think it would be really useful, but I'm having issues installing.

$ git clone git://github.com/daler/metaseq.git
Cloning into 'metaseq'...
remote: Counting objects: 471, done.
remote: Compressing objects: 100% (207/207), done.
remote: Total 471 (delta 257), reused 468 (delta 254)
Receiving objects: 100% (471/471), 2.81 MiB | 1.34 MiB/s, done.
Resolving deltas: 100% (257/257), done.

$ cd metaseq/
total 48
-rw-r--r--   1 brownj  staff   2.1K Nov  5 09:40 README.rst
drwxr-xr-x   4 brownj  staff   136B Nov  5 09:40 doc/
-rw-r--r--   1 brownj  staff    10K Nov  5 09:40 ez_setup.py
drwxr-xr-x  19 brownj  staff   646B Nov  5 09:40 metaseq/
-rw-r--r--   1 brownj  staff    73B Nov  5 09:40 requirements.txt
-rw-r--r--   1 brownj  staff   1.7K Nov  5 09:40 setup.py
brownj at bbp in ~/devel/metaseq on master

$ python setup.py install
running install
running bdist_egg
running egg_info
creating metaseq.egg-info
writing requirements to metaseq.egg-info/requires.txt
writing metaseq.egg-info/PKG-INFO
writing top-level names to metaseq.egg-info/top_level.txt
writing dependency_links to metaseq.egg-info/dependency_links.txt
writing manifest file 'metaseq.egg-info/SOURCES.txt'
reading manifest file 'metaseq.egg-info/SOURCES.txt'
writing manifest file 'metaseq.egg-info/SOURCES.txt'
installing library code to build/bdist.macosx-10.7-x86_64/egg
running install_lib
running build_py
creating build
creating build/lib.macosx-10.7-x86_64-2.7
creating build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/__init__.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/array_helpers.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/colormap_adjust.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/filetype_adapters.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/genomic_signal.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/helpers.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/minibrowser.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/plotutils.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/rebin_naive.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/results_table.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/stats.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/tables.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
copying metaseq/version.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq
creating build/lib.macosx-10.7-x86_64-2.7/metaseq/test
copying metaseq/test/__init__.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test
copying metaseq/test/test.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test
creating build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/__init__.py -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/construct_example_deseq.R -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/ex.deseq -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/gdc.bam -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/gdc.bam.bai -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/gdc.bed -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/gdc.bigbed -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
copying metaseq/test/data/x.bam -> build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data
running build_ext
cythoning metaseq/rebin.pyx to metaseq/rebin.c
building 'metaseq.rebin' extension
creating build/temp.macosx-10.7-x86_64-2.7
creating build/temp.macosx-10.7-x86_64-2.7/metaseq
/usr/bin/clang -fno-strict-aliasing -Os -w -pipe -march=native -Qunused-arguments -mmacosx-version-min=10.7 -fwrapv -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/usr/local/lib/python2.7/site-packages/numpy/core/include -I/usr/local/Cellar/python/2.7.3/include/python2.7 -c metaseq/rebin.c -o build/temp.macosx-10.7-x86_64-2.7/metaseq/rebin.o
/usr/bin/clang -bundle -undefined dynamic_lookup -L/usr/local/Cellar/readline/6.2.2/lib -L/usr/local/lib build/temp.macosx-10.7-x86_64-2.7/metaseq/rebin.o -o build/lib.macosx-10.7-x86_64-2.7/metaseq/rebin.so
creating build/bdist.macosx-10.7-x86_64
creating build/bdist.macosx-10.7-x86_64/egg
creating build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/__init__.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/array_helpers.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/colormap_adjust.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/filetype_adapters.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/genomic_signal.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/helpers.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/minibrowser.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/plotutils.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/rebin.so -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/rebin_naive.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/results_table.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/stats.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/tables.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
creating build/bdist.macosx-10.7-x86_64/egg/metaseq/test
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/__init__.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test
creating build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/__init__.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/construct_example_deseq.R -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/ex.deseq -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/gdc.bam -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/gdc.bam.bai -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/gdc.bed -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/gdc.bigbed -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/data/x.bam -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/test/test.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq/test
copying build/lib.macosx-10.7-x86_64-2.7/metaseq/version.py -> build/bdist.macosx-10.7-x86_64/egg/metaseq
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/__init__.py to __init__.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/array_helpers.py to array_helpers.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/colormap_adjust.py to colormap_adjust.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/filetype_adapters.py to filetype_adapters.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/genomic_signal.py to genomic_signal.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/helpers.py to helpers.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/minibrowser.py to minibrowser.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/plotutils.py to plotutils.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/rebin_naive.py to rebin_naive.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/results_table.py to results_table.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/stats.py to stats.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/tables.py to tables.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/test/__init__.py to __init__.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/test/data/__init__.py to __init__.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/test/test.py to test.pyc
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/version.py to version.pyc
creating stub loader for metaseq/rebin.so
byte-compiling build/bdist.macosx-10.7-x86_64/egg/metaseq/rebin.py to rebin.pyc
creating build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
installing scripts to build/bdist.macosx-10.7-x86_64/egg/EGG-INFO/scripts
running install_scripts
running build_scripts
creating build/scripts-2.7
copying and adjusting metaseq/scripts/download_metaseq_example_data.py -> build/scripts-2.7
changing mode of build/scripts-2.7/download_metaseq_example_data.py from 644 to 755
creating build/bdist.macosx-10.7-x86_64/egg/EGG-INFO/scripts
copying build/scripts-2.7/download_metaseq_example_data.py -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO/scripts
changing mode of build/bdist.macosx-10.7-x86_64/egg/EGG-INFO/scripts/download_metaseq_example_data.py to 755
copying metaseq.egg-info/PKG-INFO -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
copying metaseq.egg-info/SOURCES.txt -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
copying metaseq.egg-info/dependency_links.txt -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
copying metaseq.egg-info/requires.txt -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
copying metaseq.egg-info/top_level.txt -> build/bdist.macosx-10.7-x86_64/egg/EGG-INFO
writing build/bdist.macosx-10.7-x86_64/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
metaseq.helpers: module references __file__
creating dist
creating 'dist/metaseq-0.1dev-py2.7-macosx-10.7-x86_64.egg' and adding 'build/bdist.macosx-10.7-x86_64/egg' to it
removing 'build/bdist.macosx-10.7-x86_64/egg' (and everything under it)
Processing metaseq-0.1dev-py2.7-macosx-10.7-x86_64.egg
creating /usr/local/lib/python2.7/site-packages/metaseq-0.1dev-py2.7-macosx-10.7-x86_64.egg
Extracting metaseq-0.1dev-py2.7-macosx-10.7-x86_64.egg to /usr/local/lib/python2.7/site-packages
Adding metaseq 0.1dev to easy-install.pth file
Installing download_metaseq_example_data.py script to /usr/local/share/python

Installed /usr/local/lib/python2.7/site-packages/metaseq-0.1dev-py2.7-macosx-10.7-x86_64.egg
Processing dependencies for metaseq==0.1dev
Searching for matplotlib
Reading http://pypi.python.org/simple/matplotlib/
Reading http://matplotlib.sourceforge.net
Reading http://sourceforge.net/project/showfiles.php?group_id=80706&package_id=82474
error: None

Everything seemed to install except for metaseq.integration, but now I receive:

In [1]: import metaseq
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-3365faa9d779> in <module>()
----> 1 import metaseq

/Users/brownj/devel/metaseq/metaseq/__init__.py in <module>()
      4 from helpers import data_dir, example_filename, nice_colormap, \
      5         gfffeature_to_interval
----> 6 from genomic_signal import genomic_signal
      7 import plotutils
      8 import integration

/Users/brownj/devel/metaseq/metaseq/genomic_signal.py in <module>()
     32 from bx.bbi.bigwig_file import BigWigFile
     33 
---> 34 from array_helpers import _array, _array_parallel, _local_coverage, \
     35     _local_coverage_bigwig, _local_count
     36 import filetype_adapters

/Users/brownj/devel/metaseq/metaseq/array_helpers.py in <module>()
      6 import genomic_signal
      7 import sys
----> 8 from rebin import rebin, float_rebin
      9 from helpers import chunker
     10 import filetype_adapters

ImportError: No module named rebin

I can't remember at which step statsmodels was required, but it should probably be added to your required packages list.

mapped_read_count() OSError: No such file or directory

I am running into difficulty getting the mapped_read_count() function to work. I'm not sure if the OSError is about the BAM file, or some other file I don't know that I need. (I first ran into this issue while following example 1, which worked successfully up to this point.)

The following code uses the example 1 data from https://raw.githubusercontent.com/daler/metaseq-example-data/master/metaseq-example-data.tar.gz, and reproduces the error for me:

import metaseq
import os

bampath = "wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam"

# other options I have tried, with the same result:
#bampath = "metaseq-example/data/wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam"
#bampath = "./wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam"
#bampath = os.path.join(os.getcwd(), "wgEncodeHaibTfbsK562Atf3V0416101AlnRep1_chr17.bam")

print "Path exists?", os.path.exists(bampath)

input_signal = metaseq.genomic_signal(bampath, 'bam')

print input_signal.fn
print input_signal.fn == bampath

input_signal.mapped_read_count()

The path.exists() returns True, and the input_signal.fn is the same as bampath.

The error returned is:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-24-91e69423f63e> in <module>()
----> 1 input_signal.mapped_read_count()

/Users/poxley/anaconda/lib/python2.7/site-packages/metaseq/_genomic_signal.pyc in mapped_read_count(self, force)
    239                 self.fn]
    240         p = subprocess.Popen(
--> 241             cmds, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    242         stdout, stderr = p.communicate()
    243         if stderr:

/Users/poxley/anaconda/lib/python2.7/subprocess.pyc in __init__(self, args, bufsize, executable, stdin, stdout, stderr, preexec_fn, close_fds, shell, cwd, env, universal_newlines, startupinfo, creationflags)
    709                                 p2cread, p2cwrite,
    710                                 c2pread, c2pwrite,
--> 711                                 errread, errwrite)
    712         except Exception:
    713             # Preserve original exception in case os.close raises.

/Users/poxley/anaconda/lib/python2.7/subprocess.pyc in _execute_child(self, args, executable, preexec_fn, close_fds, cwd, env, universal_newlines, startupinfo, creationflags, shell, to_close, p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite)
   1341                         raise
   1342                 child_exception = pickle.loads(data)
-> 1343                 raise child_exception
   1344 
   1345 

OSError: [Errno 2] No such file or directory

Any idea which file it is referring to? Or why it can't find the BAM file?
This was run in a Jupyter notebook, with the Python 2.7 kernel.
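
A hedged reading of the traceback: the OSError is raised by subprocess.Popen itself, which means the executable being launched (most likely samtools, which mapped_read_count() appears to shell out to) could not be found, rather than the BAM file. A quick check from the same notebook:

    from distutils.spawn import find_executable
    print(find_executable('samtools'))   # None here would explain the OSError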

plotutils.imshow for just a heatmap alone

Currently, metaseq.plotutils.imshow forces a line plot at the bottom of the figure. There should be a way of calling the function such that only an "array_axes" is created.

genomic signal array

I am having trouble implementing example 1. I have tried with the provided data as well as a data set I am interested in (bigWig). I am getting an error at the BamSignal.array step. The code follows:

if not os.path.exists('example.npz'):
... ip_array = ip_signal.array(
... tsses_1kb,
... bins=100,
... processes=processes)
... input_array = input_signal.array(
... tsses_1kb,
... bins=100,
... processes=processes)
... ip_array /= ip_signal.mapped_read_count() / 1e6
... input_array /= input_signal.mapped_read_count() / 1e6
... metaseq.persistence.save_features_and_arrays(
... features=tsses,
... arrays={'ip': ip_array, 'input': input_array},
... prefix='example',
... link_features=True,
... overwrite=True)
...
Traceback (most recent call last):
File "", line 5, in
File "/home/keller/miniconda/envs/metaseq-test/lib/python2.7/site-packages/metaseq/_genomic_signal.py", line 126, in array
stacked_arrays = np.row_stack(arrays)
File "/home/keller/miniconda/envs/metaseq-test/lib/python2.7/site-packages/numpy/core/shape_base.py", line 228, in vstack
return _nx.concatenate([atleast_2d(_m) for _m in tup], 0)
ValueError: need at least one array to concatenate

I tried to set up just one array at a time:

input_array = input_signal.array(
... tsses_1kb,
... bins=100,
... processes=processes)
Traceback (most recent call last):
File "", line 4, in
File "/home/keller/miniconda/envs/metaseq-test/lib/python2.7/site-packages/metaseq/_genomic_signal.py", line 126, in array
stacked_arrays = np.row_stack(arrays)
File "/home/keller/miniconda/envs/metaseq-test/lib/python2.7/site-packages/numpy/core/shape_base.py", line 228, in vstack
return _nx.concatenate([atleast_2d(_m) for _m in tup], 0)

Hopefully you can point me to a solution. Thanks.

genomic_signal

Hello again. I am making an average plot from bigWig (signal) files and a gff3 reference file (midpoints of binding sites). I have successfully used this script before, but when I substitute this new gff3 file, I receive the following error message:
$ python hts.py
Traceback (most recent call last):
File "hts.py", line 49, in
processes=processes)
File "/usr/local/lib/python2.7/dist-packages/metaseq/_genomic_signal.py", line 122, in array
chunksize=chunksize, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/metaseq/array_helpers.py", line 371, in _array_parallel
chunks = list(chunker(genelist, chunksize))
File "/usr/local/lib/python2.7/dist-packages/metaseq/helpers.py", line 27, in chunker
x.append(f.next())
File "pybedtools/cbedtools.pyx", line 787, in pybedtools.cbedtools.IntervalIterator.next (pybedtools/cbedtools.cxx:11136)
File "pybedtools/cbedtools.pyx", line 697, in pybedtools.cbedtools.create_interval_from_list (pybedtools/cbedtools.cxx:9933)
pybedtools.cbedtools.MalformedBedLineError: Start is greater than stop

The final line, "MalformedBedLineError: Start is greater than stop", indicated to me that my GFF3 reference file was malformed. So I verified the start/stop positions of that file and found them to be correct (i.e. start is less than stop). Any advice would be greatly appreciated. Thanks for all the help!
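
One hedged way to track down the offending record is to scan the GFF3 directly and print any line whose start is greater than its end; pybedtools stops at the first such line, so there may be only one or two (the filename below is hypothetical). Records like this can also be produced upstream, e.g. by midpoint arithmetic near a chromosome start.

    with open('midpoints.gff3') as fh:                 # hypothetical filename
        for n, line in enumerate(fh, 1):
            if line.startswith('#') or not line.strip():
                continue
            fields = line.rstrip('\n').split('\t')
            start, end = int(fields[3]), int(fields[4])
            if start > end:
                print('line %d: start %d > end %d' % (n, start, end))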

metaseq mouse mm9 GTF file Ensembl error

Hi,
I have just installed metaseq with the recommended dependencies. My Python version is 2.7.3.
I am using bowtie-generated BAM files aligned against the mouse genome.
The commands as recommended to read the BAM files are:

ip_bam = metaseq.example_filename(
... 'galaxy58.bam')
control_bam = metaseq.example_filename(
... 'galaxy226.bam')
dbfn=metaseq.example_filename(
... 'Mus_musculus.NCBIM37.64.gtf.gz')
chip = chipseq.Chipseq(ip_bam=ip_bam, control_bam=control_bam,
... dbfn=dbfn)

Traceback (most recent call last):
File "", line 2, in
File "metaseq/integration/chipseq.py", line 96, in init
self.db = gffutils.FeatureDB(dbfn)
File "/usr/local/lib/python2.7/dist-packages/gffutils-0.7-py2.7-linux-i686.egg/gffutils/db.py", line 421, in init
''')
sqlite3.DatabaseError: file is encrypted or is not a database

My system is Ubuntu 12.04.
Do I need to install python-sqlite or python-pysqlite 2.0 (the Python interface for SQLite 3)?
Are further commands required to process the GTF file?

Any tips are highly appreciated.
suman
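
A hedged note on the error: sqlite3.DatabaseError "file is encrypted or is not a database" strongly suggests the gzipped GTF itself was passed where gffutils expects a pre-built database file, so no extra sqlite packages should be needed. A sketch of building the database first with gffutils (keyword arguments beyond data and dbfn can vary between gffutils versions):

    import gffutils

    # Build the database once; this can take a while for a whole-genome GTF.
    gffutils.create_db('Mus_musculus.NCBIM37.64.gtf',
                       dbfn='Mus_musculus.NCBIM37.64.gtf.db',
                       force=True)

    # Then pass the .db file (not the .gtf.gz) as dbfn to chipseq.Chipseq.
    dbfn = 'Mus_musculus.NCBIM37.64.gtf.db'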
