licheng5625 / python-wikitools
Automatically exported from code.google.com/p/python-wikitools
Link to project: WikiProject Astronomy
Link to assessment page:
Preferred page name: Wikipedia:WikiProject Astronomy/Popular pages
"Wikipedia:WikiProject <project>/Popular pages" is the default.
Original issue reported on code.google.com by [email protected]
on 16 Dec 2008 at 4:13
What steps will reproduce the problem?
1. Make a test script
#!/usr/bin/python
# -*- coding: UTF-8 -*-
from wikitools import wiki
from wikitools import api
# create a Wiki object
site = wiki.Wiki("http://www.wikitau.org/api.php")
# define the params for the query
params = {'action':'query', 'titles':'Accueil'}
# create the request object
request = api.APIRequest(site, params)
# query the API
result = request.query()
2. Run it:
./test.py
What is the expected output?
I suppose I would get the content of the Main page.
What do you see instead?
File "/usr/local/lib/python2.7/dist-packages/wikitools/wiki.py", line 109, in setSiteinfo
setattr(self, attr, Namespace(ns))
UnicodeEncodeError: 'ascii' codec can't encode character u'\xc9' in position 15: ordinal not in range(128)
What version of the product are you using? On what operating system?
1.1.1
Please provide any additional information below.
Original issue reported on code.google.com by [email protected]
on 26 May 2011 at 1:33
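The traceback above boils down to Python 2's implicit ASCII codec meeting a non-ASCII namespace name. A minimal reproduction and the usual fix, shown in modern Python (the name 'Équipe' is an illustrative stand-in, not taken from the report):

```python
# Reproduce the error class: ASCII cannot represent U+00C9 ('É')
name = u'\xc9quipe'  # hypothetical namespace name such as the French 'Équipe'
try:
    name.encode('ascii')
except UnicodeEncodeError as e:
    print('failed:', e)

# The usual fix is to encode explicitly as UTF-8 instead of relying on
# the implicit ASCII default.
encoded = name.encode('utf-8')
print(encoded)  # b'\xc3\x89quipe'
```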
Short explanation of the problem:
In the docstring of File.__init__, "pageid" is listed as a parameter. However,
it is not.
Expected behavior:
File.__init__ should accept "pageid" as a parameter and pass it to
page.Page.__init__ as the last parameter
Original issue reported on code.google.com by [email protected]
on 15 Jun 2010 at 3:41
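What the expected behavior would look like, sketched on minimal stand-in classes (the signatures below are illustrative, not wikitools' actual code): File.__init__ grows a pageid parameter and forwards it to Page.__init__ as the last argument.

```python
# Stand-in classes; NOT wikitools' real Page/File, just a sketch of the fix.
class Page:
    def __init__(self, site, title=False, check=True, followRedir=True, pageid=0):
        self.site = site
        self.title = title
        self.pageid = pageid

class File(Page):
    def __init__(self, site, title=False, check=True, followRedir=True, pageid=0):
        # Forward pageid to the Page constructor as the last parameter
        Page.__init__(self, site, title, check, followRedir, pageid)

f = File('site', title='File:Example.png', check=False, followRedir=False, pageid=42)
assert f.pageid == 42
```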
popularity3.py needs a command line option to create a table for a single
project. Right now this requires hacking the source and creating a second
config table, which is a PITA. This will probably require splitting
makeResults() into 2 functions - one to make the table itself given a
project and a date, and one to iterate over all the projects as it does now.
Original issue reported on code.google.com by [email protected]
on 5 Feb 2010 at 8:32
r327 needs to be applied to the 1.1 branch and a new release made. Since
1.0 will be deprecated soon, no new release is necessary, though a patch
that cleanly applies would be nice.
Original issue reported on code.google.com by [email protected]
on 7 Apr 2010 at 1:55
Short explanation of the problem:
Currently the cookie jar file is created world-readable. This can cause
security problems, as the authentication cookie is stored in the jar and
therefore can be re-used by other users.
Expected behavior:
The cookie jar file should be mode 0600.
This should be easily fixable by changing the open() call in wiki.py/WikiCookieJar/save so the file is created with mode 0600 (note that the third positional argument of the built-in open() is the buffer size, not the permission bits, so os.open() or a follow-up os.chmod() is required to actually set the mode).
Original issue reported on code.google.com by [email protected]
on 26 Jul 2009 at 11:19
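A minimal sketch of an owner-only cookie jar file, using os.open() to set the permission bits at creation time (the path is a temporary placeholder, not wikitools' real jar location):

```python
import os
import stat
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'cookiejar')  # placeholder location

# os.open() takes the permission bits at creation time, so the file is
# never world-readable, even briefly.
fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
with os.fdopen(fd, 'w') as f:
    f.write('# Netscape HTTP Cookie File\n')

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # owner-only bits; no group/other permissions
```

An os.chmod(path, 0o600) after writing would also work, but leaves a brief window during which the file is world-readable.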
When the TFA contains an {{as of}} template, breaklines() gets stuck in an
infinite loop caused by the hidden markup left by expanding the template.
Original issue reported on code.google.com by [email protected]
on 10 May 2009 at 2:05
login and possibly other functions that change maxlag can overwrite custom
maxlag settings if the custom setting is the same as whatever the function
changes it to. This probably needs an extra var in Wiki.
Original issue reported on code.google.com by [email protected]
on 13 Feb 2009 at 3:02
What steps will reproduce the problem?
1. Import wikitools and login
2. Verify that everything is working by searching for "Main Page":
>>> params = {'action':'query', 'titles':'Main Page'}
>>> request = wikitools.api.APIRequest(a, params)
>>> result = request.query()
>>> result
{u'query': {u'pages': {u'1': {u'ns': 0, u'pageid': 1, u'title': u'Main Page'}}}}
3. Attempt to edit a page:
>>> params = {'action':'edit', 'title':'Sandbox', "text":"Hello, API"}
>>> request = wikitools.api.APIRequest(a, params)
>>> result = request.query()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/wikitools/api.py", line 143, in query
raise APIError(data['error']['code'], data['error']['info'])
wikitools.api.APIError: (u'notoken', u'The token parameter must be set')
What is the expected output? What do you see instead?
Expected: The page should be edited
Actual: wikitools.api.APIError: (u'notoken', u'The token parameter must be set')
What version of the product are you using? On what operating system?
1.1.1 (I believe; I installed it today with easy_install) on Ubuntu 11.10
Please provide any additional information below.
Perhaps I'm doing something wrong, but there doesn't seem to be anything in any
documentation about this.
Original issue reported on code.google.com by [email protected]
on 5 Mar 2012 at 10:34
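The 'notoken' error is the API itself refusing the edit: action=edit requires a token, which the raw APIRequest above never supplies. A hedged sketch of what the request needs (the token value is a placeholder; on a live wiki it would come from a prior query):

```python
# Placeholder token; a live MediaWiki of that era returned an 'edittoken'
# from a request like:
#   {'action': 'query', 'prop': 'info', 'intoken': 'edit', 'titles': 'Sandbox'}
token = 'abc123+\\'  # illustrative value only

# The edit request must carry the token, otherwise the API answers with
# the (u'notoken', u'The token parameter must be set') error seen above.
params = {
    'action': 'edit',
    'title': 'Sandbox',
    'text': 'Hello, API',
    'token': token,
}
```

Using page.Page(site, 'Sandbox').edit(text='Hello, API') instead of a hand-built APIRequest sidesteps the problem, since edit() obtains the token itself via getToken('edit').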
What steps will reproduce the problem?
I'm using mediawiki behind a private server with an .htaccess file inside
mediawiki source directory. When I try to use wikitools I get the following
error:
>>> from wikitools import wiki
>>> from wikitools import api
>>> site = wiki.Wiki("http://myserver/mw/api.php")
HTTPError: HTTP Error 401: Authorization Required trying request again in 5
seconds
HTTPError: HTTP Error 401: Authorization Required trying request again in 10
seconds
^C
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "wikitools/wiki.py", line 79, in __init__
self.setSiteinfo()
File "wikitools/wiki.py", line 97, in setSiteinfo
info = req.query()
File "wikitools/api.py", line 139, in query
rawdata = self.__getRaw()
File "wikitools/api.py", line 224, in __getRaw
time.sleep(self.sleep+0.5)
KeyboardInterrupt
>>>
What is the expected output? What do you see instead?
Is there any way to specify a username/password? I think I could modify the code where urllib is called, but I would like to know if there is any login mechanism. It's the first time I'm using MediaWiki clients to automate tasks. Any help is welcome. Thanks.
What version of the product are you using? On what operating system?
wikitools-1.1.1
Python 2.6.6
debian 6.0
Original issue reported on code.google.com by [email protected]
on 5 May 2011 at 2:21
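wikitools 1.1.1 exposes no HTTP-auth option in this report, but the underlying urllib machinery can be primed with credentials before the Wiki object is built. A sketch using Python 3's urllib.request (Python 2's urllib2 has the same handler classes; the URL and credentials are placeholders, and whether wikitools' internal opener honors a globally installed one is an assumption worth verifying):

```python
import urllib.request  # urllib2 in Python 2 offers the same classes

password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
# None matches any realm; URL and credentials are placeholders.
password_mgr.add_password(None, 'http://myserver/mw/', 'user', 'secret')

auth_handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
opener = urllib.request.build_opener(auth_handler)
# Install globally so subsequent urllib requests carry the credentials.
urllib.request.install_opener(opener)
```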
The following occurs when using a string list of File page names to create a
list of page objects:
File "C:\Python26\lib\site-packages\wikitools\pagelist.py", line 157, in makePage
item = wikifile.File(site, title=title, check=False, followRedir=False, pageid=key)
TypeError: __init__() got an unexpected keyword argument 'pageid'
In wikifile.py, 'File' does not ask for this argument.
What steps will reproduce the problem?
1. Import wikitools, define a wiki object.
2. The below will produce the unexpected keyword error from above:
a = pagelist.listFromTitles(wiki, [ 'File:Example file.png' ])
What version of the product are you using? On what operating system?
python-wikitools 1.1.1. Python 2.6.6, Windows 7 64-bit.
Removing 'pageid=key' from Line 157 prevents the error and produces correct
file objects with the listFromTitles() function, but I am sceptical as to
whether this is a viable solution.
Original issue reported on code.google.com by [email protected]
on 28 Oct 2011 at 8:50
I'm trying to use the default value None for the attribute exists.
The value is set to True or False in page.setPageInfo(), so we can tell whether the check has been done. This allows several distinct checks:
if page.exists:
...
if page.exists is True:
...
(the two above are equivalent, but not the three below)
if page.exists is None: # not checked: the page would be created
...
if page.exists is False: # checked but missing: the page would be modified
...
if not page.exists: # not checked or missing
...
Original issue reported on code.google.com by [email protected]
on 8 Jun 2011 at 7:17
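The three states can be demonstrated with plain values standing in for page.exists:

```python
exists = None   # default: existence not checked yet
assert not exists and exists is not False   # falsy, yet distinguishable from False

exists = False  # checked: the page is missing
assert not exists and exists is not None

exists = True   # checked: the page exists
assert exists is True

# So `if not page.exists` lumps together 'not checked' and 'missing',
# while `is None` / `is False` tell those two cases apart.
```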
Link to project:
Wikipedia:Wikiproject Germany
Link to assessment page:
Wikipedia:Version 1.0 Editorial Team/Germany articles by quality statistics
Preferred page name:
"Wikipedia:WikiProject Germany/Popular pages" is the default.
Thanks!
Original issue reported on code.google.com by [email protected]
on 7 Jun 2009 at 3:14
For some reason the bot doesn't always start completely. Sometimes it dies
before joining the IRC channel, sometimes it joins but the log watcher
thread doesn't start.
Original issue reported on code.google.com by [email protected]
on 18 Jan 2010 at 8:15
Hi,
I attached a patch that enables you to work with UTF8 strings in namespaces (as
in the fr: wiki)
Thanks for the great lib!
Original issue reported on code.google.com by [email protected]
on 2 Jul 2010 at 11:59
Attachments:
Link to project: Wikipedia:WikiProject Solar System
Link to assessment page:
Preferred page name: Wikipedia:WikiProject Solar System/Popular pages
Original issue reported on code.google.com by [email protected]
on 23 Jan 2009 at 11:54
Link to project:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Numismatics
Link to assessment page:
http://en.wikipedia.org/wiki/Wikipedia:Version_1.0_Editorial_Team/Numismatic_articles_by_quality_statistics
Preferred page name:
Wikipedia:WikiProject Numismatics/Popular pages
Original issue reported on code.google.com by [email protected]
on 12 May 2009 at 12:43
Hello, first of all thanks for such a great and useful library. Most of
the MediaWiki API bindings are either unfinished or unsupported, or even
both :)
I'm writing a bot, and I found it very inconvenient that there's no way of
providing extended arguments to most of the API methods, e.g.
Page.getLinks(). I needed to get the links for a certain namespace, and the
only option I had was to write my own page wrapper. That's a bit disappointing.
Anyway, here's a small patch, making the namespaces look a little more human:
>>> from wikitools import Wiki, Page
>>> site = Wiki()
>>> page = Page(site, "Main Page")
>>> page.getLinks(plnamespace=site.NS_USER)
[u'User:Torsodog']
>>> page.getLinks(force=True, plnamespace=site.NS_PORTAL, pllimit=2)
[u'Portal:Arts', u'Portal:Biography']
>>> page.getLinks(force=True, plnamespace=site.NS_PORTAL | site.NS_USER, pllimit=2)
[u'User:Torsodog', u'Portal:Arts']
Cheers,
Sergei.
PS. I'm using a stable version (1.0).
Original issue reported on code.google.com by superbobry
on 18 Jan 2010 at 12:42
Attachments:
More projects to add for June:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Molecular_and_Cellular_Biology
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Mathematics
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Physics
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Chemistry
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Maryland
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Chicago
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_The_Simpsons
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Ice_Hockey
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_NASCAR
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Motorsport
Original issue reported on code.google.com by [email protected]
on 8 Jun 2009 at 3:59
User.__eq__ and User.__ne__ are broken. "other.wiki" should be "other.page"
Original issue reported on code.google.com by [email protected]
on 17 Jun 2010 at 11:11
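A sketch of what the one-word fix implies, on a stand-in class (wikitools' real User surely carries more state; the point is that both sides of the comparison must use .page):

```python
class User:
    def __init__(self, wiki, page):
        self.wiki = wiki
        self.page = page   # the attribute the comparison should use

    def __eq__(self, other):
        if not isinstance(other, User):
            return False
        # The reported bug compared against other.wiki instead of other.page.
        return self.page == other.page

    def __ne__(self, other):
        return not self.__eq__(other)

assert User('w', 'User:Alice') == User('w', 'User:Alice')
assert User('w', 'User:Alice') != User('w', 'User:Bob')
```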
Short explanation of the problem:
Once a read-only file is opened and prepared to be uploaded, the software stalls:
>>> wfile.upload(fileobj=tmp,ignorewarnings=True)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.6/site-packages/wikitools/wikifile.py", line 228, in upload
res = req.query()
File "/usr/local/lib/python2.6/site-packages/wikitools/api.py", line 139, in query
rawdata = self.__getRaw()
File "/usr/local/lib/python2.6/site-packages/wikitools/api.py", line 214, in __getRaw
data = self.opener.open(self.request)
File "/usr/local/lib/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/usr/local/lib/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/usr/local/lib/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/usr/local/lib/python2.6/urllib2.py", line 1161, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/usr/local/lib/python2.6/urllib2.py", line 1134, in do_open
r = h.getresponse()
File "/usr/local/lib/python2.6/httplib.py", line 986, in getresponse
response.begin()
File "/usr/local/lib/python2.6/httplib.py", line 391, in begin
version, status, reason = self._read_status()
File "/usr/local/lib/python2.6/httplib.py", line 349, in _read_status
line = self.fp.readline()
File "/usr/local/lib/python2.6/socket.py", line 397, in readline
data = recv(1)
KeyboardInterrupt
Expected behavior:
I would expect to either get an error, which happens when the file isn't
prepared correctly, or a success response.
Diffs of the problem (if applicable):
Original issue reported on code.google.com by [email protected]
on 25 Jun 2010 at 8:42
For some reason, when June started, the bot kept getting killed for using too
much memory; the pagelists need to be smaller. This needs to be done before
July, as fixing them in place was annoying.
Original issue reported on code.google.com by [email protected]
on 13 Jun 2010 at 10:18
The import API requires XML dumps to be uploaded, so they can be imported.
The attached patch adds file upload support with the help of the 'poster'
library (available from PyPI).
Original issue reported on code.google.com by [email protected]
on 19 Jul 2009 at 5:35
Attachments:
For example:
http://en.wikipedia.org/w/api.php?action=query&generator=categorymembers&gcmtitle=Category:Physics&prop=links&plcontinue=21397357|0|Mathematical model&llcontinue=19411913|th&gcmcontinue=Acoustic%20contrast%20factor|
If the plcontinue is removed, the query returns more links for pages not
previously listed from the last gcmcontinue.
Not sure if this was always broken or if it's a new breakage related to the recent
changes to the API. It could be an API bug, possibly related to pageids not
being returned in order; need to re-read the ML discussion.
Original issue reported on code.google.com by [email protected]
on 18 Feb 2009 at 7:06
Link to project:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Medicine
Link to assessment page:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Medicine/Assessment
Preferred page name:
"Wikipedia:WikiProject Medicine/Popular pages" is the default.
Original issue reported on code.google.com by [email protected]
on 17 Apr 2009 at 7:00
Link to project:
Wikipedia:WikiProject Sharks
Link to assessment page:
Wikipedia:Version 1.0 Editorial Team/Shark articles by quality statistics
Preferred page name:
"Wikipedia:WikiProject Sharks/Popular pages" is the default.
Original issue reported on code.google.com by [email protected]
on 10 Jun 2009 at 2:57
Certain attributes are used for comparisons, which means they shouldn't be
able to change. Making wiki.apibase immutable should be simple, but it will
require some trickery to make titles and/or pageids immutable without losing
significant functionality like the ability to create a page object without an
API query.
Original issue reported on code.google.com by [email protected]
on 8 Jun 2011 at 11:02
I'm getting pages from multiple categories and would like a way to make
sure I don't get any duplicates back. The logical way would be to put all
pages into a set() and let the set do the work for me. But since Page does
not have __hash__ defined...
p1 = page.Page(title="Sharks")
p2 = page.Page(title="Sharks")
...are treated as two different pages. Defining a __hash__ on Page would
make things a little easier for me.
Thanks for a great library!
Original issue reported on code.google.com by EmilStenstrom
on 6 Jun 2010 at 8:51
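What the request amounts to, sketched on a minimal stand-in class (not wikitools' actual Page): equal objects must hash equally, so __hash__ should use the same field as __eq__.

```python
class Page:
    def __init__(self, title):
        self.title = title

    def __eq__(self, other):
        return isinstance(other, Page) and self.title == other.title

    # Hash the same field the equality check uses, so equal pages
    # land in the same set bucket and deduplicate.
    def __hash__(self):
        return hash(self.title)

p1 = Page('Sharks')
p2 = Page('Sharks')
assert len({p1, p2}) == 1   # the set deduplicates equal pages
```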
I create a query to get the redirect links in a page using the following:
params = {'action': 'query',
          'generator': 'links',
          'gpllimit': '10',
          'pageids': pageid,
          'redirects': ''}
req = api.APIRequest(globalsite, params)
res = req.query(querycontinue=True)
This works fine as long as the page has any links, but it gets in an endless
loop in api.py if there are no links in the page.
Basically the problem is in the following lines. The data is initially False, and
after the call to __parseJSON it is an empty list []. Because an empty
list is also falsy, `not data` stays true and the loop continues forever.
while not data:
    rawdata = self.__getRaw()
    data = self.__parseJSON(rawdata)
The version is relatively recent - sorry, I can't tell you exactly which.
Original issue reported on code.google.com by [email protected]
on 25 Jul 2012 at 4:44
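The implied fix is to retry on a dedicated sentinel instead of on falsiness, so a legitimately empty result ends the loop. A self-contained sketch (get_raw and parse stand in for __getRaw and __parseJSON):

```python
def fetch(get_raw, parse):
    data = None                 # sentinel: 'not parsed yet'
    while data is None:         # [] is falsy, but it is not None, so we exit
        data = parse(get_raw())
    return data

calls = []
def get_raw():
    calls.append(1)
    return '[]'

result = fetch(get_raw, lambda raw: [])   # API returns an empty link list
assert result == [] and len(calls) == 1   # one request, no infinite loop
```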
What steps will reproduce the problem?
I log into site then try to create article using the following code
newarticle = page.Page(site, title="Sandbox Test")
newarticle.edit(title="Sandbox Test", text="This is some text text", createonly=True, summary="test summary for use of api to create pages")
What is the expected output? What do you see instead?
On one test site (MW 1.14) The new page is created first time as "api.php" (a
wiki page). I can't force it to accept my title.
On another site it tells me I have an edit rather than create token:
[u'read', u'createpage', u'createtalk', u'writeapi', u'ns104_read',
u'ns105_read', u'ns105_edit', u'ns105_create', u'ns106_read', u'ns108_read']
Sandbox Test
Traceback (most recent call last):
File "D:\Documents\PythonScripts\wikitools\ForumNokia\PageDownloader\createpage.py", line 73, in <module>
newarticle.edit(title="Sandbox Test",text="This is some text text",createonly=True,summary="test summary for use of api to create pages")
File "C:\Python25\lib\site-packages\wikitools\page.py", line 520, in edit
token = self.getToken('edit')
File "C:\Python25\lib\site-packages\wikitools\page.py", line 713, in getToken
token = response['query']['pages'][str(self.pageid)][type+'token']
KeyError: 'edittoken'
What version of the product are you using? On what operating system?
I'm using Windows. I'm not sure which version of wikitools; I expect it was obtained mid-year, though.
Please provide any additional information below.
Original issue reported on code.google.com by [email protected]
on 16 Dec 2010 at 3:41
On Wikis with Extension:LDAP Authentication there is another field required
for logging in, called "domain".
Please see the attached file for a patch which applies against 1.0 and should
work fine against "standard" wikis.
Original issue reported on code.google.com by [email protected]
on 19 Jul 2009 at 5:22
Attachments:
Link to project:
Wikipedia:WikiProject The Simpsons
Link to assessment page:
Wikipedia:Version 1.0 Editorial Team/The Simpsons articles by quality
statistics
Preferred page name:
"Wikipedia:WikiProject The Simpsons/Popular pages" is the default.
Thanks!
Original issue reported on code.google.com by [email protected]
on 25 May 2009 at 10:38
What steps will reproduce the problem?
1. python
2. from wikitools import wiki
3. from wikitools import api
4. site = wiki.Wiki("https://appel.ugent.be/userwiki/index.php/Main_Page")
What is the expected output? What do you see instead?
expected output: something like "Yeeha! wikipage loaded!"
instead:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "wikitools/wiki.py", line 79, in __init__
self.setSiteinfo()
File "wikitools/wiki.py", line 97, in setSiteinfo
info = req.query()
File "wikitools/api.py", line 140, in query
data = self.__parseJSON(rawdata)
File "wikitools/api.py", line 254, in __parseJSON
data.seek(0)
AttributeError: addinfourl instance has no attribute 'seek'
What version of the product are you using? On what operating system?
wikitools-1.1.1
Linux 2.6.31-22-generic Ubuntu (Ubuntu 9.10 - the Karmic Koala)
Please provide any additional information below.
Only the main page of this userwiki is accessible without login
Original issue reported on code.google.com by [email protected]
on 2 Dec 2010 at 2:07
What steps will reproduce the problem?
>>> import wikitools
>>> w = wikitools.Wiki()
>>> u = wikitools.User(w,"2001:DB8:0:0:0:0:0:0")
>>> u.isIP
False
What is the expected output? What do you see instead?
I expect "True".
What version of the product are you using? On what operating system?
I'm using wikitools 1.1.1 and Windows Vista.
Original issue reported on code.google.com by [email protected]
on 12 Feb 2012 at 4:03
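For comparison, a correct check that accepts both IPv4 and IPv6 can lean on the stdlib ipaddress module (Python 3.3+; wikitools targeted Python 2, where socket.inet_pton would be the closest tool, so this illustrates the logic rather than a drop-in patch):

```python
import ipaddress

def is_ip(name):
    """Return True for any valid IPv4 or IPv6 address string."""
    try:
        ipaddress.ip_address(name)
        return True
    except ValueError:
        return False

assert is_ip('2001:DB8:0:0:0:0:0:0')   # the address from the report
assert is_ip('192.0.2.1')
assert not is_ip('Jimbo Wales')        # an ordinary username
```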
Attached patch
Original issue reported on code.google.com by [email protected]
on 4 Feb 2012 at 3:50
Attachments:
Short explanation of the problem:
The function APIRequest.changeParam() relies on the undefined variable "wiki".
I get the following error message when I try to log in.
> Traceback (most recent call last):
(some lines skipped)
> File "C:\Python26\lib\site-packages\wikitools\wiki.py", line 142, in login
> req.changeParam('lgtoken', info['login']['token'])
> File "C:\Python26\lib\site-packages\wikitools\api.py", line 76, in changeParam
> self.request = urllib2.Request(wiki.apibase, self.encodeddata, self.headers)
> NameError: global name 'wiki' is not defined
Expected behavior:
"self.wiki" should be used. No error should occur.
Diffs of the problem (if applicable):
Original issue reported on code.google.com by [email protected]
on 6 May 2010 at 2:18
The script should raise an error when trying to edit a page while blocked.
Currently action=edit gives a nice informative error message/code
wikitools.api.APIError: (u'blocked', u'You have been blocked from editing')
but action=delete, and probably others, don't.
Original issue reported on code.google.com by [email protected]
on 6 Mar 2009 at 7:40
Reported by email on 2/7:
Thanks for putting together this library. I noticed that query=opensearch
breaks, because you use the default __init__ from dict for APIResult, and
query=opensearch returns a list. All I do here is check if the data is a
list, and if so, I turn it into a dictionary in a very naive way.
Certainly could be done better, but hey, it's 2am :D.
--- python-wikitools-read-only/api.py 2010-02-07 00:47:35.000000000 -0500
+++ wikitools-dev-wrapper/wikitools/api.py 2010-02-07 01:25:38.000000000
-0500
@@ -230,7 +230,12 @@
while maxlag:
try:
maxlag = False
- content = APIResult(json.loads(data.read()))
+ json_data = json.loads(data.read())
+ if isinstance(json_data,list):
+ #if here, then query=opensearch
+ json_data = { 'query' : json_data[0],
+ 'results' : json_data[1] }
+ content = APIResult(json_data)
content.response = self.response.items()
if 'error' in content:
error = content['error']['code']
Original issue reported on code.google.com by [email protected]
on 15 Feb 2010 at 12:52
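The patch's core idea in isolation: action=opensearch returns a JSON array rather than an object, so it is wrapped into a dict before being handed to APIResult (the 'query'/'results' keys are the patch author's naive choice):

```python
import json

# A typical action=opensearch response body: [search term, [matching titles]]
raw = json.dumps(['Main', ['Main Page', 'Main namespace']])

json_data = json.loads(raw)
if isinstance(json_data, list):
    # Naive wrapping, exactly as in the patch above
    json_data = {'query': json_data[0], 'results': json_data[1]}

assert json_data == {'query': 'Main', 'results': ['Main Page', 'Main namespace']}
```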
http://www.mediawiki.org/wiki/Special:Code/MediaWiki/52190 changed the
behavior of the maxlag errors to return a 503 status code, which will cause
Python to raise an exception if it's not specially handled by the opener
object: http://docs.python.org/library/urllib2.html#openerdirector-objects
Original issue reported on code.google.com by [email protected]
on 20 Jun 2009 at 3:16
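With that MediaWiki change, a lagged server surfaces as an HTTPError carrying status 503, which the caller has to catch explicitly. A sketch of the handling (the error object is constructed locally, so no network is involved; urllib2.HTTPError behaves the same way in Python 2):

```python
import io
import urllib.error  # urllib2 in Python 2

def handle(response_or_error):
    if isinstance(response_or_error, urllib.error.HTTPError):
        if response_or_error.code == 503:
            # maxlag hit: back off and retry instead of crashing
            return 'retry'
        raise response_or_error
    return 'ok'

# Build a 503 locally to exercise the branch
err = urllib.error.HTTPError('http://example.org/api.php', 503,
                             'Service Unavailable', {}, io.BytesIO(b''))
assert handle(err) == 'retry'
```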
Link to project:
[[Wikipedia:WikiProject Olympics]]
Link to assessment page:
[[Wikipedia:Version 1.0 Editorial Team/Olympics articles by quality statistics]]
Preferred page name:
Wikipedia:WikiProject Olympics/Popular pages
Original issue reported on code.google.com by [email protected]
on 12 May 2009 at 12:40
getLinks(), getTemplates(), and getCategories() just return plain lists of
titles. They should be rewritten in the form of Category.getAllMembers() to
be able to return objects.
Original issue reported on code.google.com by [email protected]
on 5 Aug 2009 at 4:29
What steps will reproduce the problem?
1. Try to edit a wiki page that has CAPTCHAs enabled
What is the expected output? What do you see instead?
You cannot edit the page, because wikitools strips "captchaword" and "captchaid"
from the argument list in the edit call.
What version of the product are you using? On what operating system?
Please provide any additional information below.
The fix is simple. Just don't strip arguments, or, if you really want to strip,
leave the known arguments (taken from http://www.mediawiki.org/wiki/API:Edit) intact.
Original issue reported on code.google.com by [email protected]
on 21 Oct 2011 at 5:38
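The suggested fix, sketched as parameter assembly (captchaid/captchaword are the MediaWiki ConfirmEdit parameter names; build_edit_params is an illustrative helper, not wikitools' real edit code):

```python
def build_edit_params(title, text, **extra):
    params = {'action': 'edit', 'title': title, 'text': text}
    # Pass unknown keyword arguments straight through instead of
    # stripping them, so CAPTCHA answers reach the API.
    params.update(extra)
    return params

p = build_edit_params('Sandbox', 'hi', captchaid='12345', captchaword='orange')
assert p['captchaid'] == '12345' and p['captchaword'] == 'orange'
```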
Link to project:
Wikipedia:WikiProject Awards and prizes
Link to assessment page:
Wikipedia:Version 1.0 Editorial Team/Awards articles by quality statistics
Preferred page name:
"Wikipedia:WikiProject Awards and prizes/Popular pages" is the default.
Thanks!
Original issue reported on code.google.com by [email protected]
on 12 Jun 2009 at 9:19
Link to project:
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_Indiana
Link to assessment page:
http://en.wikipedia.org/wiki/Wikipedia:Version_1.0_Editorial_Team/Indiana_articles_by_quality_statistics
Preferred page name:
Wikipedia:WikiProject Indiana/Popular pages
Original issue reported on code.google.com by [email protected]
on 12 May 2009 at 12:42
What steps will reproduce the problem?
1. Run example code given in README with site = wiki.Wiki(http://en.wikipedia.org/w/php)
What is the expected output? What do you see instead?
the expected output in README
I get TypeError: sequence item 0: expected string, int found
What version of the product are you using? On what operating system?
1.1.1 wikitools
2.7 python
linux
Please provide any additional information below.
Original issue reported on code.google.com by [email protected]
on 15 Aug 2011 at 10:18
Link to project: Wikipedia:WikiProject Astronomical objects
Link to assessment page:
Preferred page name: Wikipedia:WikiProject Astronomical objects/Popular pages
Original issue reported on code.google.com by [email protected]
on 23 Jan 2009 at 11:56
Wikitools 2.0, to include any major breaking changes
Original issue reported on code.google.com by [email protected]
on 8 Jun 2011 at 10:41
Link to project:
Wikipedia:WikiProject Indianapolis
Link to assessment page:
Wikipedia:Version 1.0 Editorial Team/Indianapolis articles by quality
statistics
Preferred page name:
"Wikipedia:WikiProject Indianapolis/Popular pages" is the default.
Thanks!!
Original issue reported on code.google.com by [email protected]
on 7 Jun 2009 at 3:11
Link to project:
http://en.wikipedia.org/wiki/WP:TB
Link to assessment page:
http://en.wikipedia.org/wiki/WP:TBA
Preferred page name:
Wikipedia:WikiProject The Beatles/Popular pages
Original issue reported on code.google.com by [email protected]
on 23 Jan 2009 at 7:02
[deleted issue]