
A Python package developed for transportation spatio-temporal big data processing, analysis and visualization.

Home Page: https://transbigdata.readthedocs.io/en/latest/

License: BSD 3-Clause "New" or "Revised" License



TransBigData


Introduction

TransBigData is a Python package developed for transportation spatio-temporal big data processing, analysis and visualization. It provides fast and concise methods for processing common transportation spatio-temporal big data such as taxi GPS data, bicycle-sharing data and bus GPS data, with a variety of processing methods for each stage of the analysis. Code written with TransBigData is clean, efficient, flexible, and easy to use, allowing complex data tasks to be achieved with concise code.

For some specific types of data, TransBigData also provides targeted tools for specific needs, such as extracting the origins and destinations (OD) of taxi trips from taxi GPS data and identifying arrival and departure information from bus GPS data. The latest stable release can be installed via pip, and full documentation can be found at https://transbigdata.readthedocs.io/en/latest/. Introduction slides can be found here and here (in Chinese).

Target Audience

The target audience of TransBigData includes:

  • Data science researchers and data engineers in the field of transportation big data, smart transportation systems, and urban computing, particularly those who want to integrate innovative algorithms into intelligent transportation systems.
  • Government, enterprises, or other entities who expect efficient and reliable management decision support through transportation spatio-temporal data analysis.

Technical Features

  • Provides a variety of processing methods for each stage of transportation spatio-temporal big data analysis.
  • Code written with TransBigData is clean, efficient, flexible, and easy to use, allowing complex data tasks to be achieved with concise code.

Main Functions

Currently, TransBigData mainly provides the following methods:

  • Data Quality: Provides methods to quickly obtain general information about the dataset, including the data amount, the time period covered and the sampling interval.
  • Data Preprocessing: Provides methods to clean multiple types of data errors.
  • Data Gridding: Provides methods to generate multiple types of geographic grids (rectangular and hexagonal) in the research area, with fast algorithms to map GPS data onto the generated grids.
  • Data Aggregating: Provides methods to aggregate GPS data and OD data into geographic polygons.
  • Data Visualization: Built-in visualization capabilities leverage the keplergl package to interactively visualize data in Jupyter Notebook with simple code.
  • Trajectory Processing: Provides methods to process trajectory data, including generating trajectory linestrings from GPS points, trajectory densification, etc.
  • Basemap Loading: Provides methods to display a Mapbox basemap on matplotlib figures.
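As a minimal illustration of what the data-quality stage computes, the basic facts about a dataset — record count, observed time period, and mean sampling interval — can be obtained with plain pandas. This is a sketch, not the TransBigData API; the column names follow the taxi example later in this README:

```python
import pandas as pd

# Toy GPS records; column names follow the taxi example in this README
data = pd.DataFrame({
    'VehicleNum': [1, 1, 1, 2, 2],
    'time': ['08:00:00', '08:00:30', '08:01:00', '08:00:10', '08:00:40'],
    'lon': [113.80, 113.81, 113.82, 113.90, 113.91],
    'lat': [22.60, 22.61, 22.62, 22.70, 22.71],
})
t = pd.to_datetime(data['time'])

amount = len(data)                  # data amount
period = t.max() - t.min()          # observed time period
# mean sampling interval per vehicle, in seconds
interval = t.groupby(data['VehicleNum']).diff().dt.total_seconds().mean()

print(amount, period, interval)
```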

Grid processing framework offered by TransBigData

Here is an overview of the gridding framework offered by TransBigData.

1648715064154.png

See This Example for further details.

Trajectory processing framework offered by TransBigData

Here is an overview of the Trajectory processing framework offered by TransBigData.

trajs.png

See This Example for further details.

Installation

TransBigData supports Python >= 3.6.

Using pypi

TransBigData can be installed with pip. Before installing TransBigData, make sure that you have a working installation of the geopandas package. If you already have geopandas installed, run the following from the command prompt to install TransBigData:

pip install transbigdata

Using conda-forge

You can also install TransBigData from conda-forge; this will automatically resolve the dependencies:

conda install -c conda-forge transbigdata

Contributing to TransBigData

All contributions, bug reports, bug fixes, documentation improvements, enhancements and ideas are welcome. A detailed overview on how to contribute can be found in the contributing guide on GitHub.

Examples

Example of data visualization

Visualize trajectories (with keplergl)

gif

Visualize data distribution (with keplergl)

gif

Visualize OD (with keplergl)

gif

Example of taxi GPS data processing

The following example shows how to use TransBigData to perform data gridding, data aggregation and data visualization for taxi GPS data.

Read the data

import transbigdata as tbd
import pandas as pd
#Read taxi gps data  
data = pd.read_csv('TaxiData-Sample.csv',header = None) 
data.columns = ['VehicleNum','time','lon','lat','OpenStatus','Speed'] 
data
VehicleNum time lon lat OpenStatus Speed
0 34745 20:27:43 113.806847 22.623249 1 27
1 34745 20:24:07 113.809898 22.627399 0 0
2 34745 20:24:27 113.809898 22.627399 0 0
3 34745 20:22:07 113.811348 22.628067 0 0
4 34745 20:10:06 113.819885 22.647800 0 54
... ... ... ... ... ... ...
544994 28265 21:35:13 114.321503 22.709499 0 18
544995 28265 09:08:02 114.322701 22.681700 0 0
544996 28265 09:14:31 114.336700 22.690100 0 0
544997 28265 21:19:12 114.352600 22.728399 0 0
544998 28265 19:08:06 114.137703 22.621700 0 0

544999 rows × 6 columns

Data pre-processing

Define the study area and use the tbd.clean_outofbounds method to remove the data outside the study area:

#Define the study area
bounds = [113.75, 22.4, 114.62, 22.86]
#Delete the data out of the study area
data = tbd.clean_outofbounds(data,bounds = bounds,col = ['lon','lat'])
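Under the hood, this kind of bounds cleaning amounts to a simple bounding-box filter. A minimal pandas sketch of the idea (not the TransBigData implementation):

```python
import pandas as pd

bounds = [113.75, 22.4, 114.62, 22.86]  # [lon_min, lat_min, lon_max, lat_max]
data = pd.DataFrame({'lon': [113.80, 115.00, 114.00],
                     'lat': [22.50, 22.50, 23.50]})

# Keep only points inside the bounding box
mask = (data['lon'].between(bounds[0], bounds[2]) &
        data['lat'].between(bounds[1], bounds[3]))
cleaned = data[mask]
print(len(cleaned))  # only the first point is inside the box
```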

Data gridding

The most basic way to express the data distribution is with geographic grids. TransBigData provides methods to generate multiple types of geographic grids (rectangular and hexagonal) in the research area. For rectangular gridding, you first need to determine the gridding parameters (which can be interpreted as defining a grid coordinate system):

#Obtain the gridding parameters
params = tbd.area_to_params(bounds,accuracy = 1000)
params

{'slon': 113.75, 'slat': 22.4, 'deltalon': 0.00974336289289822, 'deltalat': 0.008993210412845813, 'theta': 0, 'method': 'rect', 'gridsize': 1000}

The gridding parameters store the information of the initial position, the size and the angle of the gridding system.
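For the rectangular case, mapping a coordinate to its grid index reduces to rounding the offset from the origin in units of the cell size. The following sketch assumes this rounding convention (an assumption about the library's internals, shown here only to make the parameters concrete):

```python
import numpy as np

# Gridding parameters from the example above
params = {'slon': 113.75, 'slat': 22.4,
          'deltalon': 0.00974336289289822, 'deltalat': 0.008993210412845813}

def lonlat_to_grid(lon, lat, params):
    # Round the offset from the origin to the nearest cell index
    loncol = int(np.floor((lon - params['slon']) / params['deltalon'] + 0.5))
    latcol = int(np.floor((lat - params['slat']) / params['deltalat'] + 0.5))
    return loncol, latcol

print(lonlat_to_grid(113.806847, 22.623249, params))  # -> (6, 25)
```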

The next step is to map the GPS data to their corresponding grids. tbd.GPS_to_grid generates the LONCOL column and the LATCOL column (for rectangular grids); together, the two columns specify a grid cell:

#Map the GPS data to grids
data['LONCOL'],data['LATCOL'] = tbd.GPS_to_grid(data['lon'],data['lat'],params)

Count the amount of data in each grid, generate the grid geometries and transform the result into a GeoDataFrame:

#Aggregate data into grids
grid_agg = data.groupby(['LONCOL','LATCOL'])['VehicleNum'].count().reset_index()
#Generate grid geometry
grid_agg['geometry'] = tbd.grid_to_polygon([grid_agg['LONCOL'],grid_agg['LATCOL']],params)
#Change the type into GeoDataFrame
import geopandas as gpd
grid_agg = gpd.GeoDataFrame(grid_agg)
#Plot the grids
grid_agg.plot(column = 'VehicleNum',cmap = 'autumn_r')

png
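Going the other way — recovering a cell's centre coordinate from its (LONCOL, LATCOL) index — is the inverse linear mapping. A sketch under the same parameter convention as above (an assumption, for illustration):

```python
# Gridding parameters from the example above
params = {'slon': 113.75, 'slat': 22.4,
          'deltalon': 0.00974336289289822, 'deltalat': 0.008993210412845813}

def grid_to_centre(loncol, latcol, params):
    # The cell centre sits at origin + index * cell size
    lon = params['slon'] + loncol * params['deltalon']
    lat = params['slat'] + latcol * params['deltalat']
    return lon, lat

lon, lat = grid_to_centre(6, 25, params)
print(lon, lat)
```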

Triangle and Hexagon grids & rotation angle

TransBigData also supports triangle and hexagon grids, as well as a given rotation angle for the grids. We can alter the gridding parameters:

#set to the hexagon grids
params['method'] = 'hexa'
#or set as triangle grids: params['method'] = 'tri'
#set a rotation angle (degree)
params['theta'] = 5

Then we can do the GPS data matching again:

#Triangle and hexagon grids require three columns to store the ID
data['loncol_1'],data['loncol_2'],data['loncol_3'] = tbd.GPS_to_grid(data['lon'],data['lat'],params)
#Aggregate data into grids
grid_agg = data.groupby(['loncol_1','loncol_2','loncol_3'])['VehicleNum'].count().reset_index()
#Generate grid geometry
grid_agg['geometry'] = tbd.grid_to_polygon([grid_agg['loncol_1'],grid_agg['loncol_2'],grid_agg['loncol_3']],params)
#Change the type into GeoDataFrame
import geopandas as gpd
grid_agg = gpd.GeoDataFrame(grid_agg)
#Plot the grids
grid_agg.plot(column = 'VehicleNum',cmap = 'autumn_r')

1648714436503.png

Data Visualization (with basemap)

For a geographical data visualization figure, we still have to add the basemap, the colorbar, the compass and the scale. Use tbd.plot_map to load the basemap and tbd.plotscale to add a compass and scale to the matplotlib figure:

import matplotlib.pyplot as plt
fig = plt.figure(1, (8, 8), dpi=300)
ax = plt.subplot(111)
plt.sca(ax)
#Load basemap
tbd.plot_map(plt,bounds,zoom = 11,style = 4)
#Define colorbar
cax = plt.axes([0.05, 0.33, 0.02, 0.3])
plt.title('Data count')
plt.sca(ax)
#Plot the data
grid_agg.plot(column = 'VehicleNum',cmap = 'autumn_r',ax = ax,cax = cax,legend = True)
#Add scale
tbd.plotscale(ax,bounds = bounds,textsize = 10,compasssize = 1,accuracy = 2000,rect = [0.06,0.03],zorder = 10)
plt.axis('off')
plt.xlim(bounds[0],bounds[2])
plt.ylim(bounds[1],bounds[3])
plt.show()

1648714582961.png

Citation information

Please cite TransBigData when using it in your research. Citation information is as follows:

@article{Yu2022,
  doi       = {10.21105/joss.04021},
  url       = {https://doi.org/10.21105/joss.04021},
  year      = {2022},
  publisher = {The Open Journal},
  volume    = {7},
  number    = {71},
  pages     = {4021},
  author    = {Qing Yu and Jian Yuan},
  title     = {TransBigData: A Python package for transportation spatio-temporal big data processing, analysis and visualization},
  journal   = {Journal of Open Source Software}
}

Introduction video (in Chinese): bilibili


transbigdata's Issues

The filtering condition in the `traj_clean_drift` function seems incorrect

Hello, and many thanks for your open-source work and the detailed tutorials on various platforms; I have benefited a lot from them. While using the library in recent experiments, I found that the outlier-removal algorithm seems incorrect, and I would appreciate your advice.

Describe the bug

The relevant code is:

data1 = data1[
    -((data1[VehicleNum+'_pre'] == data1[VehicleNum]) &
      (data1[VehicleNum+'_next'] == data1[VehicleNum]) &
      (data1['speed_pre'] > speedlimit) &
      (data1['speed_next'] > speedlimit) &
      (data1['speed_prenext'] < speedlimit))]

This part cleans the data against the speed limit: records exceeding speedlimit should be removed. But the semantics of the source code seem to be: a row is removed only when the speeds to both the previous and the next point exceed the threshold while the speed between the previous and next points is below it — an almost impossible condition to satisfy. I guess the intended semantics is: remove the row if any of the speed values exceeds the threshold, so this should be changed to:

data1 = data1[
    -((data1[VehicleNum+'_pre'] == data1[VehicleNum]) &
      (data1[VehicleNum+'_next'] == data1[VehicleNum]) &
      ((data1['speed_pre'] > speedlimit) |
       (data1['speed_next'] > speedlimit) |
       (data1['speed_prenext'] < speedlimit)))]

Also, for efficiency, I think the latter condition should be placed first: in most cases the speed threshold is not exceeded, so the expression can short-circuit early. Here is the further formatted code:

data1 = data1[
    -(((data1['speed_pre'] > speedlimit) |
       (data1['speed_next'] > speedlimit) |
       (data1['speed_prenext'] > speedlimit)) &
      ((data1[VehicleNum + '_pre'] == data1[VehicleNum]) &
       (data1[VehicleNum + '_next'] == data1[VehicleNum])))]

Similarly, the subsequent distance-limit and angle-limit filters need the same fix.
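Translated into a toy example, the difference between the two conditions is whether a point whose speed jump appears on only one side gets removed. A small pandas demonstration with illustrative data (not from the library):

```python
import pandas as pd

speedlimit = 80
# Three candidate points: jump on both sides, jump on one side, no jump
df = pd.DataFrame({
    'speed_pre':     [120, 120, 30],
    'speed_next':    [110,  40, 30],
    'speed_prenext': [ 30,  30, 30],
})

# Original condition: both neighbouring speeds must exceed the limit
cond_and = ((df['speed_pre'] > speedlimit) &
            (df['speed_next'] > speedlimit) &
            (df['speed_prenext'] < speedlimit))
# Proposed condition: any speed exceeding the limit is enough
cond_or = ((df['speed_pre'] > speedlimit) |
           (df['speed_next'] > speedlimit) |
           (df['speed_prenext'] > speedlimit))

print(df[~cond_and].index.tolist())  # rows kept by the original filter
print(df[~cond_or].index.tolist())   # rows kept by the proposed filter
```

The one-sided jump (row 1) survives the original filter but is dropped by the proposed one.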

When I import latitude and longitude data for visualization, I get this error

TypeError Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_27080/1699484629.py in
----> 1 data['LONCOL'],data['LATCOL'] = tbd.GPS_to_grids(data['longitude'],data['latitude'],params)

D:\anaconda3\envs\pytorch\lib\site-packages\transbigdata\grids.py in GPS_to_grids(*args, **kwargs)
1227 def GPS_to_grids(*args, **kwargs):
1228 warnings.warn("This method is renamed as transbigdata.GPS_to_grid")
-> 1229 return GPS_to_grid(*args, **kwargs)
1230
1231

D:\anaconda3\envs\pytorch\lib\site-packages\transbigdata\grids.py in GPS_to_grid(lon, lat, params)
240 method = params['method']
241 if method == 'rect':
--> 242 loncol, latcol = GPS_to_grids_rect(lon, lat, params)
243 return [loncol, latcol]
244 if method == 'tri':

D:\anaconda3\envs\pytorch\lib\site-packages\transbigdata\grids.py in GPS_to_grids_rect(lon, lat, params, from_origin)
814 coords = coords - (np.array([lonStart, latStart]))
815 else:
--> 816 coords = coords - (np.array([lonStart, latStart]) - R[0, :] / 2 -
817 R[1, :] / 2)
818 res = np.floor(np.dot(coords, np.linalg.inv(R)))

TypeError: unsupported operand type(s) for -: 'str' and 'float'
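This TypeError typically means the longitude/latitude columns were read as strings (for example, when read_csv picks up a header row as data). A hedged workaround, using standard pandas rather than anything TransBigData-specific, is to coerce the columns to numeric before calling GPS_to_grid:

```python
import pandas as pd

# Columns accidentally read as strings
data = pd.DataFrame({'longitude': ['113.81', '113.82'],
                     'latitude': ['22.62', '22.63']})

# Coerce text columns to floats; invalid entries become NaN and are dropped
data['longitude'] = pd.to_numeric(data['longitude'], errors='coerce')
data['latitude'] = pd.to_numeric(data['latitude'], errors='coerce')
data = data.dropna(subset=['longitude', 'latitude'])

print(data.dtypes['longitude'])
```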

FileNotFindError in plotmap.py

There were some problems when I tried to set imgsavepath; it seems that `filepath = searchfile('mapboxtoken.txt')` on line 68 doesn't work when I type the following code in my Jupyter Notebook:

import transbigdata as tbd
tbd.set_imgsavepath(r'/home/liqi/CodeSpace/Map_tile/')

I don't know whether it was caused by the miniconda virtual environment or something else.

Screenshots
image

  • OS: Ubuntu20.04LTS

[JOSS] target audience?

Regarding the Statement of Need, the docs and manuscript clearly state what problems are being solved, but there is no mention of who the target audience is.

This should be added to:

  • the actual manuscript
  • README.md
  • the docs site

For reference:

  • Documentation
    Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Software paper
    Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?

xref: openjournals/joss-reviews#4021

'MultiLineString' object is not iterable

#Extract the arrival/departure trajectories from the intersection
import shapely
if line_intersection.is_empty:
    pass
else:
    #Put each arrival into a list
    if type(line_intersection) == shapely.geometry.linestring.LineString:
        arrive = [line_intersection]
    else:
        arrive = list(line_intersection)
    #Build a DataFrame
    arrive = pd.DataFrame(arrive)
    #Take the arrival and departure time of each visit
    arrive['arrivetime'] = arrive[0].apply(lambda r: r.coords[0][0])
    arrive['leavetime'] = arrive[0].apply(lambda r: r.coords[-1][0])
    arrive

This raises the error: 'MultiLineString' object is not iterable. How can I resolve it?
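In recent Shapely versions, multi-part geometries are no longer directly iterable; iterating over their `.geoms` attribute works instead. A minimal sketch of the branching logic (using a constructed MultiLineString in place of the intersection result above):

```python
from shapely.geometry import LineString, MultiLineString

# Stand-in for the intersection result in the issue above
line_intersection = MultiLineString([[(0, 0), (1, 1)], [(2, 2), (3, 3)]])

# Single LineStrings are wrapped in a list; multi-part geometries expose .geoms
if isinstance(line_intersection, LineString):
    arrive = [line_intersection]
else:
    arrive = list(line_intersection.geoms)

print(len(arrive))
```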

About vehicle trajectory data visualization

Hello Dr. Yu, thank you for your work; I would like to ask a question. If I do not distinguish between deliver and idle and use `traj_to_linestring` to visualize the whole trajectory data directly, which trajectory data should I pass in: the cleaned data, or the sliced move_points? And should the `ID` parameter be `VehicleNum` in this case?

Improving the plot_map function

plot_map currently only supports fetching basemaps in the WGS 84 coordinate system.

But should this feature be extended for the following user scenario?

Geospatial data in Europe and North America often uses region-specific CRSs. To draw a basemap with tbd.plot_map, the data must be forcibly reprojected to EPSG:4326 so that it aligns with the basemap.

However, reprojecting such data to 4326 introduces large visual distortion.

Is there a workable way to address this?

Error when fetching administrative districts

Running
import transbigdata
transbigdata.getadmin(510104,'c5bXXXXXXXXXXX6XXXXXXXXX',subdistricts=True)

raises
KeyError Traceback (most recent call last)
~\AppData\Local\Temp/ipykernel_9772/996543621.py in
1 import transbigdata
----> 2 transbigdata.getadmin(510104,'c5ba0abd32f03e57446a5521ebfa0d59',subdistricts=True)

D:\jb\lib\site-packages\transbigdata\crawler.py in getadmin(keyword, ak, subdistricts)
99 datas = []
100 k = 0
--> 101 polyline = result['districts'][k]['polyline']
102 polyline1 = polyline.split('|')
103 res = []

KeyError: 'districts'

[JOSS] State of the field?

Currently no other similar software packages are mentioned in the manuscript, which is a requirement. There should be an overview of comparable packages, even/especially if they are not written in Python. Further, if no other packages exist this should be mentioned.

xref: this comment from @anitagraser.

Problem encountered when visualizing aggregated OD

Hello, today I followed the bike-sharing community-detection example to visualize truck GPS data and ran into the problem shown in the attached figures: the aggregated-OD visualization did not come out as expected. What could be the cause, and which parameter or step needs to be corrected? Thank you for your reply!

Fetching city bus-stop data with tbd.getbusdata no longer works: RemoteDisconnected / connection timeout

When I try to fetch bus-stop data with tbd.getbusdata, the request fails with a connection timeout. I am using Jupyter Notebook, and this did not happen before. Could you please take a look?

My code is as follows:

import geopandas as gpd
import transbigdata as tbd
line,stop = tbd.getbusdata("南京", ['1路' , '2路', '3路'], accurate=True, timeout=20)
gdf = gpd.GeoDataFrame(stop)
output_file = 'TBD南京公交数据.csv'
gdf.to_csv(output_file, index=False, encoding = 'utf-8-sig')

The error is as follows:
RemoteDisconnected Traceback (most recent call last)
D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
702 # Make the request on the httplib connection object.
--> 703 httplib_response = self._make_request(
704 conn,

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
448 # Otherwise it looks like a bug in the code.
--> 449 six.raise_from(e, None)
450 except (SocketTimeout, BaseSSLError, SocketError) as e:

D:\ANACONDA3\lib\site-packages\urllib3\packages\six.py in raise_from(value, from_value)

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
443 try:
--> 444 httplib_response = conn.getresponse()
445 except BaseException as e:

D:\ANACONDA3\lib\http\client.py in getresponse(self)
1376 try:
-> 1377 response.begin()
1378 except ConnectionError:

D:\ANACONDA3\lib\http\client.py in begin(self)
319 while True:
--> 320 version, status, reason = self._read_status()
321 if status != CONTINUE:

D:\ANACONDA3\lib\http\client.py in _read_status(self)
288 # sending a valid response.
--> 289 raise RemoteDisconnected("Remote end closed connection without"
290 " response")

RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

ProtocolError Traceback (most recent call last)
D:\ANACONDA3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
488 if not chunked:
--> 489 resp = conn.urlopen(
490 method=request.method,

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
786
--> 787 retries = retries.increment(
788 method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]

D:\ANACONDA3\lib\site-packages\urllib3\util\retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
549 if read is False or not self._is_method_retryable(method):
--> 550 raise six.reraise(type(error), error, _stacktrace)
551 elif read is not None:

D:\ANACONDA3\lib\site-packages\urllib3\packages\six.py in reraise(tp, value, tb)
768 if value.traceback is not tb:
--> 769 raise value.with_traceback(tb)
770 raise value

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
702 # Make the request on the httplib connection object.
--> 703 httplib_response = self._make_request(
704 conn,

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
448 # Otherwise it looks like a bug in the code.
--> 449 six.raise_from(e, None)
450 except (SocketTimeout, BaseSSLError, SocketError) as e:

D:\ANACONDA3\lib\site-packages\urllib3\packages\six.py in raise_from(value, from_value)

D:\ANACONDA3\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
443 try:
--> 444 httplib_response = conn.getresponse()
445 except BaseException as e:

D:\ANACONDA3\lib\http\client.py in getresponse(self)
1376 try:
-> 1377 response.begin()
1378 except ConnectionError:

D:\ANACONDA3\lib\http\client.py in begin(self)
319 while True:
--> 320 version, status, reason = self._read_status()
321 if status != CONTINUE:

D:\ANACONDA3\lib\http\client.py in _read_status(self)
288 # sending a valid response.
--> 289 raise RemoteDisconnected("Remote end closed connection without"
290 " response")

ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

ConnectionError Traceback (most recent call last)
C:\Windows\Temp\ipykernel_10028\269894834.py in
1 import geopandas as gpd
2 import transbigdata as tbd
----> 3 line,stop = tbd.getbusdata("上海",['轨道交通1号线'],accurate = True,timeout = 20)
4 gdf = gpd.GeoDataFrame(stop)
5 output_file = r"C:\Users\郑迪\pythonScript\工具包\工具包\各个城市基础数据(坐标系未知)\TBD上海站点.csv"

D:\ANACONDA3\lib\site-packages\transbigdata\crawler.py in getbusdata(city, keywords, accurate, timeout)
249 for keyword in keywords:
250 print(keyword)
--> 251 for uid in getlineuid(keyword, c, accurate):
252 if uid not in uids:
253 try:

D:\ANACONDA3\lib\site-packages\transbigdata\crawler.py in getlineuid(keyword, c, acc)
182 url = 'http://map.baidu.com/?qt=s&wd=' +
183 urllib.parse.quote(keyword)+'&c='+c+'&from=webmap'
--> 184 response = requests.get(url)
185 searchinfo = json.loads(response.text)
186 try:

D:\ANACONDA3\lib\site-packages\requests\api.py in get(url, params, **kwargs)
71 """
72
---> 73 return request("get", url, params=params, **kwargs)
74
75

D:\ANACONDA3\lib\site-packages\requests\api.py in request(method, url, **kwargs)
57 # cases, and look like a memory leak in others.
58 with sessions.Session() as session:
---> 59 return session.request(method=method, url=url, **kwargs)
60
61

D:\ANACONDA3\lib\site-packages\requests\sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
585 }
586 send_kwargs.update(settings)
--> 587 resp = self.send(prep, **send_kwargs)
588
589 return resp

D:\ANACONDA3\lib\site-packages\requests\sessions.py in send(self, request, **kwargs)
699
700 # Send the request
--> 701 r = adapter.send(request, **kwargs)
702
703 # Total elapsed time of the request (approximately)

D:\ANACONDA3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
545
546 except (ProtocolError, OSError) as err:
--> 547 raise ConnectionError(err, request=request)
548
549 except MaxRetryError as e:

ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Data values change during computation

Hi everyone, when using TransBigData to aggregate and count data within grids, I found that values that were originally between 0 and 1 all became 1 after the computation. How can I keep the values unchanged during the computation?
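One likely cause (an assumption, since the original code is not shown): aggregating with `.count()` returns the number of rows per grid rather than the values themselves, so 0–1 values in grids holding a single record all show up as 1, while `.sum()` or `.mean()` preserves the magnitudes:

```python
import pandas as pd

# Toy gridded values between 0 and 1
df = pd.DataFrame({'LONCOL': [0, 0, 1], 'LATCOL': [0, 0, 0],
                   'value': [0.2, 0.4, 0.7]})

counted = df.groupby(['LONCOL', 'LATCOL'])['value'].count()  # row counts per grid
summed = df.groupby(['LONCOL', 'LATCOL'])['value'].sum()     # value sums per grid

print(counted.tolist())
print(summed.tolist())
```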

line, stop = tbd.getbusdata('深圳', ['2号线']) raises: No such busline


Mapbox image not displayed

mapbox

mapbox1


Package Version


appnope 0.1.2
argon2-cffi 21.1.0
attrs 21.2.0
backcall 0.2.0
bleach 4.1.0
blis 0.7.7
catalogue 2.0.7
certifi 2021.10.8
cffi 1.15.0
chardet 4.0.0
click 8.0.4
click-plugins 1.1.1
cligj 0.7.2
commonmark 0.9.1
cycler 0.11.0
cymem 2.0.6
d2l 0.17.3
debugpy 1.5.1
decorator 5.1.0
defusedxml 0.7.1
entrypoints 0.3
filelock 3.6.0
Fiona 1.8.21
geopandas 0.10.2
huggingface-hub 0.5.1
idna 2.10
ipykernel 6.4.2
ipython 7.29.0
ipython-genutils 0.2.0
ipywidgets 7.6.5
jedi 0.18.0
jellyfish 0.9.0
Jinja2 3.0.2
joblib 1.1.0
jsonschema 4.1.2
jupyter 1.0.0
jupyter-client 7.0.6
jupyter-console 6.4.0
jupyter-contrib-core 0.3.3
jupyter-contrib-nbextensions 0.5.1
jupyter-core 4.9.1
jupyter-highlight-selected-word 0.2.0
jupyter-latex-envs 1.4.6
jupyter-nbextensions-configurator 0.4.1
jupyterlab-pygments 0.1.2
jupyterlab-widgets 1.0.2
keplergl 0.3.2
keybert 0.5.1
kiwisolver 1.3.2
langcodes 3.3.0
lxml 4.8.0
mapclassify 2.4.3
MarkupSafe 2.0.1
matplotlib 3.3.3
matplotlib-inline 0.1.3
mistune 0.8.4
multi-rake 0.0.2
munch 2.5.0
murmurhash 1.0.6
nbclient 0.5.4
nbconvert 6.2.0
nbformat 5.1.3
nest-asyncio 1.5.1
networkx 2.8
nltk 3.7
notebook 6.4.5
numpy 1.18.5
packaging 21.2
pandas 1.2.2
pandocfilters 1.5.0
parso 0.8.2
pathy 0.6.1
pexpect 4.8.0
pickleshare 0.7.5
Pillow 8.4.0
pip 22.0.4
plot-map 0.3.7
preshed 3.0.6
prometheus-client 0.12.0
prompt-toolkit 3.0.21
ptyprocess 0.7.0
pycld2 0.41
pycparser 2.20
pydantic 1.8.2
pygeos 0.12.0
Pygments 2.10.0
pyparsing 2.4.7
pyproj 3.3.1
pyrsistent 0.18.0
python-dateutil 2.8.2
pytz 2021.3
PyYAML 6.0
pyzmq 22.3.0
qtconsole 5.1.1
QtPy 1.11.2
regex 2022.3.15
requests 2.25.1
rich 12.2.0
Rtree 1.0.0
sacremoses 0.0.49
scikit-learn 1.0.1
scipy 1.7.2
segtok 1.5.11
Send2Trash 1.8.0
sentence-transformers 2.2.0
sentencepiece 0.1.96
setuptools 56.0.0
Shapely 1.8.1.post1
six 1.16.0
sklearn 0.0
smart-open 5.2.1
spacy 3.2.4
spacy-legacy 3.0.9
spacy-loggers 1.0.2
srsly 2.4.2
summa 1.2.0
tabulate 0.8.9
terminado 0.12.1
testpath 0.5.0
thinc 8.0.15
threadpoolctl 3.0.0
tokenizers 0.12.1
torch 1.8.1
torchvision 0.9.1
tornado 6.1
tqdm 4.64.0
traitlets 5.1.1
traittypes 0.2.1
transbigdata 0.4.5
transformers 4.18.0
treelite 2.2.2
treelite-runtime 2.2.2
typer 0.4.1
typing_extensions 4.1.1
urllib3 1.26.8
wasabi 0.9.1
wcwidth 0.2.5
webencodings 0.5.1
widgetsnbextension 3.5.2
xgboost 1.5.2
yake 0.4.8

Extra grid cells appear when gridding with custom bounds

When using custom bounds to generate grids, the resulting cells extend beyond the specified bounds (the white area in the figure).

I suspect this may be a coordinate-system issue — what is the default CRS of area_to_grid?

Code:

grid, params = tbd.area_to_grid(boundary, accuracy=3000, method='rect')

#Gridding parameters; under rectangular gridding the method parameter is 'rect'
pprint.pprint(params)

#Grid geometries
grid.head()

Output:

{'deltalat': 0.02697963123853744,
 'deltalon': 0.033639196976294326,
 'gridsize': 3000,
 'method': 'rect',
 'slat': 36.636559,
 'slon': 116.950786,
 'theta': 0}


pyproj error message — is this a package version problem? Thanks!

File "pyproj/_crs.pyx", line 2338, in pyproj._crs._CRS.init
PJ_TYPE_VERTICAL_CRS: "Vertical CRS",
pyproj.exceptions.CRSError: Invalid projection: epsg:4326: (Internal Proj Error: proj_create: SQLite error on SELECT name, type, coordinate_system_auth_name, coordinate_system_code, datum_auth_name, datum_code, area_of_use_auth_name, area_of_use_code, text_definition, deprecated FROM geodetic_crs WHERE auth_name = ? AND code = ?: no such column: area_of_use_auth_name)

Analyzing bus OD from smart-card and GPS data

Hello, sorry to bother you!
Have you considered further analysis of bus data: inferring boarding and alighting stops from smart-card and GPS data to obtain bus OD, and then analyzing other operational indicators of bus routes?

Best regards, and thank you!

[Collaboration with Baidu's open-source deep learning platform PaddlePaddle]

Hello, I am Shi Yixin, a product manager at Baidu PaddlePaddle. I see that transbigdata provides rich capabilities for transportation spatio-temporal big data analysis. As the first open-source deep learning platform in China, PaddlePaddle hopes to keep serving various industries, so I would like to discuss whether there are opportunities for deeper integration or collaboration. If convenient, you can add me on WeChat (same as my phone number): 18108656919

Looking forward to your reply!

Shi Yixin | Baidu PaddlePaddle Product Manager

[JOSS] Missing dependencies after conda install

I followed the conda install instructions

You can also install TransBigData by conda-forge, this will automaticaly solve the dependency, it can be installed with:

conda install -c conda-forge transbigdata

and it seems that not all dependencies are resolved because I get

ModuleNotFoundError: No module named 'scipy'

image

xref: openjournals/joss-reviews#4021

71 """
72
---> 73 return request("get", url, params=params, **kwargs)
74
75

D:\ANACONDA3\lib\site-packages\requests\api.py in request(method, url, **kwargs)
57 # cases, and look like a memory leak in others.
58 with sessions.Session() as session:
---> 59 return session.request(method=method, url=url, **kwargs)
60
61

D:\ANACONDA3\lib\site-packages\requests\sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
585 }
586 send_kwargs.update(settings)
--> 587 resp = self.send(prep, **send_kwargs)
588
589 return resp

D:\ANACONDA3\lib\site-packages\requests\sessions.py in send(self, request, **kwargs)
699
700 # Send the request
--> 701 r = adapter.send(request, **kwargs)
702
703 # Total elapsed time of the request (approximately)

D:\ANACONDA3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
545
546 except (ProtocolError, OSError) as err:
--> 547 raise ConnectionError(err, request=request)
548
549 except MaxRetryError as e:

ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

module 'transbigdata' has no attribute 'clean_same'

Hello, may I ask whether this method has been removed in the latest version? I could not find any mention of it in the changelog. It is used on p. 269 of your textbook《交通时空大数据分析、挖掘与可视化》. Thanks!
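If `clean_same` is indeed gone from newer releases, a rough stand-in can be written in plain pandas. The sketch below only approximates the old behaviour (dropping records whose position is unchanged from the previous record of the same vehicle); the column names `VehicleNum`, `Lng`, `Lat` are illustrative, not a schema required by the library.

```python
import pandas as pd

# Hypothetical GPS records: vehicle id, timestamp, position
df = pd.DataFrame({
    "VehicleNum": [1, 1, 1, 1, 2, 2],
    "Time": ["00:00", "00:01", "00:02", "00:03", "00:00", "00:01"],
    "Lng": [113.9, 113.9, 113.9, 114.0, 113.5, 113.5],
    "Lat": [22.5, 22.5, 22.5, 22.6, 22.4, 22.4],
})

# Keep only rows whose position differs from the previous row of the same
# vehicle, approximating clean_same's removal of redundant stationary points
g = df.groupby("VehicleNum")
moved = df["Lng"].ne(g["Lng"].shift()) | df["Lat"].ne(g["Lat"].shift())
cleaned = df[moved].reset_index(drop=True)
print(cleaned)
```

The first record of each vehicle is always kept, since shifting yields NaN there and the inequality test is True.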

[JOSS] Testing?

I may be missing something, but there does not appear to be any actual CI testing being performed on transbigdata. It seems that line 36 of .github/workflows/python-app.yml simply runs pytest on test_transbigdata.py, which merely imports the package (and also happens in .travis.yml). If I am incorrect in this assessment, please show where the actual CI testing is being performed.

xref: openjournals/joss-reviews#4021

[JOSS] Community Guidelines?

There are currently no Community Guidelines (the JOSS checklist asks: are there clear guidelines for third parties wishing to 1) contribute to the software, 2) report issues or problems with the software, and 3) seek support?). Thorough community guidelines can be added here, and the Related Links section in README.md could also be expanded with more description.

xref: openjournals/joss-reviews#4021

Example 1 - Taxi GPS data processing fails to run

The failure occurs at step 5: Aggregate OD into polygons.

# Aggregate OD data to polygons
# without passing gridding parameters, the algorithm will map the data
# to polygons directly using their coordinates
od_gdf = tbd.odagg_shape(oddata, sz, round_accuracy=6)
fig = plt.figure(1, (16, 6), dpi=150)  # figure size (width 16, height 6) and resolution
ax1 = plt.subplot(111)
od_gdf.plot(ax=ax1, column='count')
plt.xticks([], fontsize=10)
plt.yticks([], fontsize=10)
plt.title('OD Trips', fontsize=12)

Error message:
ImportError: Spatial indexes require either rtree or pygeos. See installation instructions at https://geopandas.org/install.html
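As the error message says, geopandas' spatial index needs either `rtree` or `pygeos` installed, so `pip install rtree` (or `pip install pygeos`) usually resolves this. A quick way to check which backend is missing in the current environment (a generic sketch, not part of TransBigData):

```python
from importlib.util import find_spec

# geopandas' spatial index requires either rtree or pygeos;
# list whichever backends are absent from the current environment
missing = [pkg for pkg in ("rtree", "pygeos") if find_spec(pkg) is None]
if len(missing) == 2:
    print("No spatial-index backend found; install one of:", ", ".join(missing))
else:
    print("A spatial-index backend is available")
```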

Installed package versions:

Package                           Version
--------------------------------- -----------
appnope                           0.1.2
argon2-cffi                       21.1.0
attrs                             21.2.0
backcall                          0.2.0
bleach                            4.1.0
blis                              0.7.7
catalogue                         2.0.7
certifi                           2021.10.8
cffi                              1.15.0
chardet                           4.0.0
click                             8.0.4
click-plugins                     1.1.1
cligj                             0.7.2
commonmark                        0.9.1
cycler                            0.11.0
cymem                             2.0.6
d2l                               0.17.3
debugpy                           1.5.1
decorator                         5.1.0
defusedxml                        0.7.1
entrypoints                       0.3
filelock                          3.6.0
Fiona                             1.8.21
geopandas                         0.10.2
huggingface-hub                   0.5.1
idna                              2.10
ipykernel                         6.4.2
ipython                           7.29.0
ipython-genutils                  0.2.0
ipywidgets                        7.6.5
jedi                              0.18.0
jellyfish                         0.9.0
Jinja2                            3.0.2
joblib                            1.1.0
jsonschema                        4.1.2
jupyter                           1.0.0
jupyter-client                    7.0.6
jupyter-console                   6.4.0
jupyter-contrib-core              0.3.3
jupyter-contrib-nbextensions      0.5.1
jupyter-core                      4.9.1
jupyter-highlight-selected-word   0.2.0
jupyter-latex-envs                1.4.6
jupyter-nbextensions-configurator 0.4.1
jupyterlab-pygments               0.1.2
jupyterlab-widgets                1.0.2
keplergl                          0.3.2
keybert                           0.5.1
kiwisolver                        1.3.2
langcodes                         3.3.0
lxml                              4.8.0
mapclassify                       2.4.3
MarkupSafe                        2.0.1
matplotlib                        3.3.3
matplotlib-inline                 0.1.3
mistune                           0.8.4
multi-rake                        0.0.2
munch                             2.5.0
murmurhash                        1.0.6
nbclient                          0.5.4
nbconvert                         6.2.0
nbformat                          5.1.3
nest-asyncio                      1.5.1
networkx                          2.8
nltk                              3.7
notebook                          6.4.5
numpy                             1.18.5
packaging                         21.2
pandas                            1.2.2
pandocfilters                     1.5.0
parso                             0.8.2
pathy                             0.6.1
pexpect                           4.8.0
pickleshare                       0.7.5
Pillow                            8.4.0
pip                               22.0.4
plot-map                          0.3.7
preshed                           3.0.6
prometheus-client                 0.12.0
prompt-toolkit                    3.0.21
ptyprocess                        0.7.0
pycld2                            0.41
pycparser                         2.20
pydantic                          1.8.2
Pygments                          2.10.0
pyparsing                         2.4.7
pyproj                            3.3.1
pyrsistent                        0.18.0
python-dateutil                   2.8.2
pytz                              2021.3
PyYAML                            6.0
pyzmq                             22.3.0
qtconsole                         5.1.1
QtPy                              1.11.2
regex                             2022.3.15
requests                          2.25.1
rich                              12.2.0
sacremoses                        0.0.49
scikit-learn                      1.0.1
scipy                             1.7.2
segtok                            1.5.11
Send2Trash                        1.8.0
sentence-transformers             2.2.0
sentencepiece                     0.1.96
setuptools                        56.0.0
Shapely                           1.8.1.post1
six                               1.16.0
sklearn                           0.0
smart-open                        5.2.1
spacy                             3.2.4
spacy-legacy                      3.0.9
spacy-loggers                     1.0.2
srsly                             2.4.2
summa                             1.2.0
tabulate                          0.8.9
terminado                         0.12.1
testpath                          0.5.0
thinc                             8.0.15
threadpoolctl                     3.0.0
tokenizers                        0.12.1
torch                             1.8.1
torchvision                       0.9.1
tornado                           6.1
tqdm                              4.64.0
traitlets                         5.1.1
traittypes                        0.2.1
transbigdata                      0.4.5
transformers                      4.18.0
treelite                          2.2.2
treelite-runtime                  2.2.2
typer                             0.4.1
typing_extensions                 4.1.1
urllib3                           1.26.8
wasabi                            0.9.1
wcwidth                           0.2.5
webencodings                      0.5.1
widgetsnbextension                3.5.2
xgboost                           1.5.2
yake                              0.4.8

Problem encountered when building the metro network topology

Hello! When building the metro network topology with tbd.metro_network(), I got the error 'DataFrame' object has no attribute 'append' at the line edge = edge1.append(edge2). Searching online suggests that pandas removed the append method in a recent release in favour of concat. After changing the line to edge = pd.concat([edge1, edge2]) it runs normally. It might be worth updating the code accordingly.
Your tutorials and the TransBigData library have helped me a lot. Thank you!
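The reported fix can be shown in isolation. The edge tables below are toy data, and the `ostop`/`dstop`/`duration` column names are assumptions for illustration, not necessarily what `tbd.metro_network` uses internally:

```python
import pandas as pd

# Two edge tables of the kind a network builder might accumulate
edge1 = pd.DataFrame({"ostop": ["A"], "dstop": ["B"], "duration": [120]})
edge2 = pd.DataFrame({"ostop": ["B"], "dstop": ["C"], "duration": [150]})

# pandas >= 2.0 removed DataFrame.append, so
#   edge = edge1.append(edge2)
# must become:
edge = pd.concat([edge1, edge2], ignore_index=True)  # ignore_index keeps row labels unique
print(edge)
```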

A small issue when computing shortest paths in the metro network model

Hello, while building the rail transit network with transbigdata and computing shortest paths, I noticed that if the origin or destination is a transfer station, the result contains a transfer at the origin/destination itself.
For example, from Guangzhou Railway Station to Zhujiang New Town, Guangzhou Line 5 runs direct, yet the shortest path shows a transfer at the origin (Guangzhou Railway Station) and another on arrival at Zhujiang New Town, which introduces some error into the computed transfer time.
How can this be avoided in the code? Many thanks!
(After repeated tries, I found that the line chosen at a transfer station basically follows ascending line number: if a transfer station serves Line 2 and Line 5, the path will always start from the Line 2 node rather than the Line 5 node.)
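One workaround, without touching the library, is to trim spurious transfers at the two ends of the returned path. The sketch below assumes the path is a list of `(station, line)` nodes; this representation is an assumption for illustration, not TransBigData's actual internal format:

```python
def trim_end_transfers(path):
    """Drop fake transfers at the path endpoints: if the first (or last)
    two nodes share a station, the hop between them is not a real ride."""
    if len(path) >= 2 and path[0][0] == path[1][0]:
        path = path[1:]      # spurious transfer at the origin
    if len(path) >= 2 and path[-1][0] == path[-2][0]:
        path = path[:-1]     # spurious transfer at the destination
    return path

# Toy path reproducing the reported symptom: a direct Line 5 ride
# wrapped in fake transfers at both ends
path = [("Guangzhou Railway Station", "Line 2"),
        ("Guangzhou Railway Station", "Line 5"),
        ("Xiaobei", "Line 5"),
        ("Zhujiang New Town", "Line 5"),
        ("Zhujiang New Town", "Line 3")]
print(trim_end_transfers(path))
```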

Setting the col parameter of tbd.data_summary

Running the following code:
tbd.data_summary(df, col=['track_id', 'time', 'lon', 'lat'], show_sample_duration=True)
produces this error:
ValueError: too many values to unpack (expected 2)

This is inconsistent with the documentation, although it appears to be a minor issue.

In addition, for trajectory data with very short sampling intervals, the displayed summary is also wrong; apparently too many decimal places are being truncated.

This also seems to be a minor issue.

Hexagonal grid does not cover the whole bounds area

Hello, while using your library I found the following:

When I generate a hexagonal grid ('hexa'), as in (2), a point inside self.bounds such as (104.1545..., 30.8061...) (see (1)) maps to grid id (21, -13, -34) according to (3), yet this id does not appear anywhere in self.grid.

self.bounds = [103.90, 30.52, 104.26, 30.81]   (1)
self.grid, self.params = tbd.area_to_grid(self.bounds, accuracy=1200, method='hexa')   (2)
tbd.GPS_to_grid(104.15457922898818, 30.80613863513823, self.params)   (3)

Does this mean the hexagonal grid does not fully cover the bounds area?
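A generic way to diagnose this is to map sampled points inside the bounds to grid ids and test their membership against the generated grid. The sketch below uses a toy set of hexagon ids in place of the real output of `tbd.area_to_grid` / `tbd.GPS_to_grid`, which are not re-implemented here:

```python
# Toy hexagon ids standing in for the ids listed in self.grid;
# real ids come from tbd.area_to_grid
grid_ids = {(i, j, -i - j) for i in range(-2, 3) for j in range(-2, 3)}

def ids_outside_grid(point_ids, grid_ids):
    """Return sampled point ids that the generated grid does not contain."""
    return [pid for pid in point_ids if pid not in grid_ids]

# (21, -13, -34) is the id reported above; relative to the toy grid it is
# missing, mirroring the reported coverage gap near the edge of the bounds
sampled = [(0, 0, 0), (21, -13, -34)]
print(ids_outside_grid(sampled, grid_ids))
```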

Errors when downloading bus data

Sometimes the error is No such busline; at other times:
ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
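The intermittent RemoteDisconnected errors come from the remote map server dropping connections, and retrying with backoff often helps. Below is a generic requests retry setup (a sketch; TransBigData's crawler does not expose such a hook, so you would need to wrap or patch the request yourself):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry transient failures up to 5 times with exponential backoff
retry = Retry(total=5, backoff_factor=1,
              status_forcelist=[500, 502, 503, 504])
session = requests.Session()
session.mount("http://", HTTPAdapter(max_retries=retry))
session.mount("https://", HTTPAdapter(max_retries=retry))

# session.get(url, timeout=20) can now stand in for a bare requests.get(url)
```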

The requests package is missing from the requirements file


Add an exclusive parameter to ckdnearest_point

When gdfA equals gdfB, ckdnearest_point matches each point to itself.

geopandas' sjoin_nearest has added an exclusive parameter; when it is True, self-matches are excluded.

It would be great if ckdnearest_point gained this option 🙂
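Until such a parameter exists, self-matches can be excluded by querying two nearest neighbours and discarding the first hit, which is the point itself at distance zero. A sketch using scipy's `cKDTree` (the structure `ckdnearest_point` is named after); the coordinates are toy data:

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy coordinates standing in for gdfA == gdfB
pts = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0]])

tree = cKDTree(pts)
# Query the two nearest neighbours of each point; the first column is the
# point itself (distance 0), so the second column excludes self-matches
dist, idx = tree.query(pts, k=2)
nearest_excl_self = idx[:, 1]
print(nearest_excl_self)
```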
