| Column | Type | Range / values in this sample |
|---|---|---|
| type | class label | 1 class |
| id | string | length 7 |
| subreddit.id | string (class) | 1 value |
| subreddit.name | string (class) | 1 value |
| subreddit.nsfw | bool | 1 class |
| created_utc | unknown | |
| permalink | string | length 61 to 109 |
| body | large string | length 0 to 9.98k |
| sentiment | float32 | -1 to 1 |
| score | int32 | -65 to 195 |
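The schema above is enough to load and sanity-check the sample rows. A minimal sketch, assuming the rows shown below have been exported to a local JSON Lines file named `datasets_comments.jsonl` (the file name and the export step are assumptions, not part of this card):

```python
import pandas as pd

# Hypothetical local export of the sample rows below; the file name is an
# assumption, not something provided by this dataset card.
df = pd.read_json("datasets_comments.jsonl", lines=True)

# created_utc arrives as ISO-8601 strings; parse them into timestamps.
df["created_utc"] = pd.to_datetime(df["created_utc"], utc=True)

# Sanity-check the documented column types and ranges.
print(df.dtypes)
print(df["score"].agg(["min", "max"]))      # documented range: -65 to 195
print(df["sentiment"].agg(["min", "max"]))  # documented range: -1 to 1
```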
| type | id | subreddit.id | subreddit.name | subreddit.nsfw | created_utc | permalink | body | sentiment | score |
|---|---|---|---|---|---|---|---|---|---|
| comment | c0nrspd | 2r97t | datasets | false | 2010-04-09T03:44:22Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrspd/ | Thanks, I added that book to my list. | 0.4404 | 1 |
| comment | c0nrpp6 | 2r97t | datasets | false | 2010-04-09T03:03:54Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrpp6/ | ["Programming Collective Intelligence"](http://oreilly.com/catalog/9780596529321) by Toby Segaran. | 0 | 2 |
| comment | c0nrgey | 2r97t | datasets | false | 2010-04-09T00:54:03Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrgey/ | [deleted] | null | 3 |
| comment | c0nrfi8 | 2r97t | datasets | false | 2010-04-09T00:40:58Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrfi8/ | If you go to a university with a Safari Books subscription (you probably do if you go to a tech school), you can read this online free. | 0.5106 | 3 |
| comment | c0nrdsq | 2r97t | datasets | false | 2010-04-09T00:17:06Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrdsq/ | Is there a way to put them into Stata? | 0 | 2 |
| comment | c0nrdno | 2r97t | datasets | false | 2010-04-09T00:15:08Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nrdno/ | http://www.benfry.com/ This is the guy behind Processing and some updated data-visualization stuff. Check out what he does. | 0 | 4 |
| comment | c0nr396 | 2r97t | datasets | false | 2010-04-08T22:00:47Z | https://old.reddit.com/r/datasets/comments/bmn5y/rice_singlepixel_camera_project_zip/c0nr396/ | >Please acknowledge the use of this data in publications via a reference to the "Rice Single-Pixel Camera Project, http://dsp.rice.edu/cscamera" | 0 | 1 |
| comment | c0nr371 | 2r97t | datasets | false | 2010-04-08T21:59:55Z | https://old.reddit.com/r/datasets/comments/bmn5y/rice_singlepixel_camera_project_zip/c0nr371/ | http://dsp.rice.edu/cscamera From a quick glance, it looks like it's a whole lot of unprocessed image data from a camera that takes pictures one pixel at a time. | 0.3612 | 2 |
| comment | c0nr31r | 2r97t | datasets | false | 2010-04-08T21:57:59Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nr31r/ | * Which dataset(s)? * What formats are the datasets available in? * What result do you want at the end? (Another dataset, pretty graphs etc etc) | 0.6174 | 1 |
| comment | c0nqkli | 2r97t | datasets | false | 2010-04-08T18:30:34Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nqkli/ | The Collective Intelligence book from O'Reilly goes through a large amount of data analysis stuff using Python and is a rather excellent book overall which I recommend to anyone regardless of their chosen language. If that doesn't get you started, I don't know what would. | 0.8519 | 3 |
| comment | c0nqcgn | 2r97t | datasets | false | 2010-04-08T16:54:41Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nqcgn/ | > ... having a well formated table from the start would help loads That's an understatement :). Data will be messy, and personally the combination R+Python (& occasional shell scripting) has served me well when dealing with it. As for the GUI, I'm doubtfull that a GUI being both generic and powerful enough for datacleaning is feasable. I'm curious if I need to retract that claim in the [near future](http://blog.freebase.com/2010/03/26/preview-freebase-gridworks/) though... | 0.8176 | 2 |
| comment | c0nq6so | 2r97t | datasets | false | 2010-04-08T15:45:39Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nq6so/ | [deleted] | null | 1 |
| comment | c0nq6gc | 2r97t | datasets | false | 2010-04-08T15:41:31Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nq6gc/ | [deleted] | null | 1 |
| comment | c0nq6cv | 2r97t | datasets | false | 2010-04-08T15:40:25Z | https://old.reddit.com/r/datasets/comments/bmlcf/the_internet_movie_database/c0nq6cv/ | A couple alternatives: themoviedb.org and movielens.org (http://www.grouplens.org/node/73) | 0 | 1 |
| comment | c0nq0r2 | 2r97t | datasets | false | 2010-04-08T14:27:57Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0nq0r2/ | > ... should be an easier way to load data What exactly is difficult in the loading process? Do you mean the data-cleaning part? | 0.168 | 2 |
| comment | c0npw3q | 2r97t | datasets | false | 2010-04-08T13:09:22Z | https://old.reddit.com/r/datasets/comments/bo4pn/hey_rdatasets_id_really_like_to_start_working/c0npw3q/ | [deleted] | null | 2 |
| comment | c0nngvq | 2r97t | datasets | false | 2010-04-07T13:52:16Z | https://old.reddit.com/r/datasets/comments/bn9j3/statistics_facebook/c0nngvq/ | A bit off topic: > Average user has 130 friends on the site. Average user sends 8 friend requests per month Am I the only one that's way behind these numbers. | 0.743 | 1 |
| comment | c0nn0vq | 2r97t | datasets | false | 2010-04-07T06:34:23Z | https://old.reddit.com/r/datasets/comments/bmypj/geoscience_australia_data_dumps_mapping_gis/c0nn0vq/ | Public GIS datasets ftw. Thanks. | null | 3 |
| comment | c0nmpkg | 2r97t | datasets | false | 2010-04-07T03:44:49Z | https://old.reddit.com/r/datasets/comments/bmypj/geoscience_australia_data_dumps_mapping_gis/c0nmpkg/ | Writing acquisition and processing software, setting up field procedures, doing field work, doing data processing, overseeing processing, planning large area coverage, having final say on datasets submitted to industry clients and cc'd to Geosciences Australia. The Northern Territory led the way in making their data sets public, G.A. has since followed suit for all of Australia. I was associated for a decade with a now sold out private company that acquired and processed this data for the exploration industry. | 0.2263 | 4 |
| comment | c0nml0q | 2r97t | datasets | false | 2010-04-07T02:49:08Z | https://old.reddit.com/r/datasets/comments/bmypj/geoscience_australia_data_dumps_mapping_gis/c0nml0q/ | At the dataset stage or the field work stage? | 0 | 2 |
| comment | c0nlzk9 | 2r97t | datasets | false | 2010-04-06T21:53:21Z | https://old.reddit.com/r/datasets/comments/bn94g/us_states_ranked_by_carbon_emission_size/c0nlzk9/ | See my analysis [Do Red or Blue States Emit More Carbon Dioxide? ](http://citizenjosh.com/blog/1-do-red-states-emit-more-carbon-dioxide.html) | 0 | 1 |
| comment | c0nlojb | 2r97t | datasets | false | 2010-04-06T19:42:34Z | https://old.reddit.com/r/datasets/comments/bn94g/us_states_ranked_by_carbon_emission_size/c0nlojb/ | If you divide population by CO2, you get some interesting results: aside from DC, which has a lot of traffic & population in a small area, the outliers are Vermont, Rhode Island, and Idaho. What are they doing there to ratchet up CO2 emission? | 0.4019 | 1 |
| comment | c0nkrpm | 2r97t | datasets | false | 2010-04-06T11:56:27Z | https://old.reddit.com/r/datasets/comments/bmqmu/lots_of_open_datasets_to_play_with_at_ropendata/c0nkrpm/ | Thanks. Being a data person I'd like to merge the two subreddits, but I can see why you'd keep them separate, too. | 0.4019 | 1 |
| comment | c0nknkl | 2r97t | datasets | false | 2010-04-06T09:03:06Z | https://old.reddit.com/r/datasets/comments/bn16w/eurostat_statistical_data_of_europe/c0nknkl/ | It's got a lot of data available as [publications](http://epp.eurostat.ec.europa.eu/portal/page/portal/publications/recently_published), [browsable statistics](http://epp.eurostat.ec.europa.eu/portal/page/portal/statistics/themes) and [bulk download](http://epp.eurostat.ec.europa.eu/portal/page/portal/statistics/bulk_download) | 0 | 1 |
| comment | c0nkmg1 | 2r97t | datasets | false | 2010-04-06T08:19:00Z | https://old.reddit.com/r/datasets/comments/bmzjb/automatically_assessing_the_post_quality_in/c0nkmg1/ | You have a cake next to your name. | 0 | 1 |
| comment | c0nkjli | 2r97t | datasets | false | 2010-04-06T06:58:13Z | https://old.reddit.com/r/datasets/comments/bmnt3/all_wikimedia_downloads_database_dumps_static/c0nkjli/ | I promise I won't make a joke about how well liberal bias compresses. | 0.276 | 1 |
| comment | c0nkj5g | 2r97t | datasets | false | 2010-04-06T06:46:49Z | https://old.reddit.com/r/datasets/comments/bmypj/geoscience_australia_data_dumps_mapping_gis/c0nkj5g/ | FYI - I had a hand in the gathering of a significant chunk of the magnetic and radiometric data in these sets a few years back. I'm happy to field what questions I can. | 0.8816 | 3 |
| comment | c0nka20 | 2r97t | datasets | false | 2010-04-06T03:56:04Z | https://old.reddit.com/r/datasets/comments/9s7vp/zip_code_database/c0nka20/ | Database seems to be gone. Anyone got a mirror? | 0 | 1 |
| comment | c0nk7er | 2r97t | datasets | false | 2010-04-06T03:14:25Z | https://old.reddit.com/r/datasets/comments/bmqep/the_google_public_data_explorer_makes_large/c0nk7er/ | Why are the [OECD data files](http://oberon.sourceoecd.org/vl=2741047/cl=20/nw=1/rpsv/factbook2009/index.htm) Excel files? Makes my heart sad. | 0.6249 | 1 |
| comment | c0nk42h | 2r97t | datasets | false | 2010-04-06T02:23:23Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0nk42h/ | Cool, so the API is nice enough to let us both go our own way ;) | 0.7184 | 1 |
| comment | c0njyyf | 2r97t | datasets | false | 2010-04-06T01:05:28Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0njyyf/ | I'm currently focusing on the Android system and on that XML support is non-existent while parsing a JSON object is one call. | 0.4019 | 1 |
| comment | c0njwhw | 2r97t | datasets | false | 2010-04-06T00:28:06Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0njwhw/ | With the code I'm working on I honestly find XML to be more useful (XSLT is good) | 0.8439 | 1 |
| comment | c0njvcv | 2r97t | datasets | false | 2010-04-06T00:11:33Z | https://old.reddit.com/r/datasets/comments/bmjli/dataaustraliagovau_datasets/c0njvcv/ | Oh sweet. Have you used any of these yet? | 0.4588 | 1 |
| comment | c0njuek | 2r97t | datasets | false | 2010-04-05T23:57:52Z | https://old.reddit.com/r/datasets/comments/bmn5y/rice_singlepixel_camera_project_zip/c0njuek/ | http://lmgtfy.com/?q=rice+single-pixel+camera&l=1 | null | -4 |
| comment | c0njnyp | 2r97t | datasets | false | 2010-04-05T22:27:12Z | https://old.reddit.com/r/datasets/comments/bmn5y/rice_singlepixel_camera_project_zip/c0njnyp/ | What is this? Link to more context, please. | 0.3774 | 2 |
| comment | c0njasm | 2r97t | datasets | false | 2010-04-05T19:39:07Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0njasm/ | This is a revelation. Honestly, if there's one thing I hate about reddit, it's the opinions. What I want is a link bin. I want raw data- I don't want opinions. Thank you for bringing reason to what would otherwise be a circle jerk. | -0.5903 | 2 |
| comment | c0nj9oe | 2r97t | datasets | false | 2010-04-05T19:26:25Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0nj9oe/ | I'm sorry, did you mean to link to [this?](http://www.reddit.com/r/datasets/.json) | -0.0772 | 2 |
| comment | c0nj5ga | 2r97t | datasets | false | 2010-04-05T18:39:32Z | https://old.reddit.com/r/datasets/comments/bmn4z/ask_rdatasets_good_resource_for_citytown_and_zip/c0nj5ga/ | [deleted] | null | 1 |
| comment | c0nirrt | 2r97t | datasets | false | 2010-04-05T15:57:46Z | https://old.reddit.com/r/datasets/comments/bmlcf/the_internet_movie_database/c0nirrt/ | Other Licensing Details: Minimum Price: We offer data licensing packages that are customized to meet your needs with annual fees ranging from $15,000 to higher depending on the audience for the data and which data are being licensed. We are not able to offer any sort of data license for less than $15,000. Data Delivery: Licensed data is delivered via a web service which provides customers with real-time access to XML structured data. We do not provide static, flat data feeds. In order to access IMDb’s licensed data, you will need to be able to implement a web service by using protocols such as SOAP, REST or Query. Licensing customers usually update their data set once per week, though more frequent updates may be available if required. Restrictions: IMDb does not exchange content or enter into partnerships in exchange for rev-share, cost-per-click, "branding opportunities" or in return for traffic back to IMDb.com. Additionally, IMDb will not authorize our data to be re-distributed or sub-licensed to another third party. | 0.6486 | 1 |
| comment | c0nio1l | 2r97t | datasets | false | 2010-04-05T15:00:51Z | https://old.reddit.com/r/datasets/comments/bmlcf/the_internet_movie_database/c0nio1l/ | Alternative database w/"better" license: http://www.freebase.com/type/schema/film/film | 0 | 2 |
| comment | c0nin11 | 2r97t | datasets | false | 2010-04-05T14:44:42Z | https://old.reddit.com/r/datasets/comments/bmn4z/ask_rdatasets_good_resource_for_citytown_and_zip/c0nin11/ | I found this that's turned out to be a great resource for city/town/county information: * http://www.business.gov/about/features/api/geodata/all-data.html Have yet to dig for ZIP codes. | 0.6249 | 1 |
| comment | c0nilba | 2r97t | datasets | false | 2010-04-05T14:13:55Z | https://old.reddit.com/r/datasets/comments/bmnx6/amazons_aws_block_storage_devices_can_come/c0nilba/ | full list here: http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=243 | 0 | 1 |
| comment | c0nijh2 | 2r97t | datasets | false | 2010-04-05T13:36:22Z | https://old.reddit.com/r/datasets/comments/bmn4z/ask_rdatasets_good_resource_for_citytown_and_zip/c0nijh2/ | This was posted in another thread. http://www.free-zipcodes.com/ | 0 | 1 |
| comment | c0niesp | 2r97t | datasets | false | 2010-04-05T10:58:49Z | https://old.reddit.com/r/datasets/comments/bmlol/rdatasets_dataset/c0niesp/ | See http://code.reddit.com/wiki/API and http://code.reddit.com/browser/r2/r2/controllers/api.py | null | 3 |
| comment | c0niekx | 2r97t | datasets | false | 2010-04-05T10:46:37Z | https://old.reddit.com/r/datasets/comments/bmlcf/the_internet_movie_database/c0niekx/ | Careful of the licence on this one. | 0.1531 | 2 |
| comment | c0niejq | 2r97t | datasets | false | 2010-04-05T10:45:19Z | https://old.reddit.com/r/datasets/comments/bmlk0/wikipediamediawiki_api_good_for_querying_any/c0niejq/ | Also a great example of exposing an API with very useful help for developers. Note that for bulk datamining there are better dumps of Wikipedia around. | 0.8545 | 1 |
| comment | c0nic75 | 2r97t | datasets | false | 2010-04-05T08:50:31Z | https://old.reddit.com/r/datasets/comments/bmkh4/lbnlicsi_enterprise_tracing_project_packet/c0nic75/ | From the overview: "We have collected packet traces that span more than 100 hours of activity from a total of several thousand internal hosts. This wealth of data, which we are publicly releasing in anonymized form, spans a wide range of dimensions. " | 0.5789 | 1 |
| comment | c0nibg3 | 2r97t | datasets | false | 2010-04-05T08:18:54Z | https://old.reddit.com/r/datasets/comments/bmjvh/realtime_access_to_a_lot_of_datasets_through_one/c0nibg3/ | https://developer.yahoo.com/yql/console/?q=SELECT%20*%20FROM%20geo.placemaker%20WHERE%20documentContent%20in%20%28select%20text%20from%20twitter.search%20where%20q%3D%22going%20to%22%29&env=store://datatables.org/alltableswithkeys#h=SELECT%20*%20FROM%20geo.placemaker%20WHERE%20documentContent%20in%20%28select%20text%20from%20twitter.search%20where%20q%3D%22going%20to%22%29%20and%20documentType%3D%22text/plain%22%20and%20appid%3D%22%22 | null | 1 |
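The sentiment column falls in [-1, 1], and the values resemble compound scores from a rule-based analyzer such as VADER; that attribution is an assumption, since the card does not say how sentiment was computed. A minimal sketch of reproducing a comparable score for the first comment body:

```python
# pip install vaderSentiment
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
body = "Thanks, I added that book to my list."

# polarity_scores() returns neg/neu/pos plus a normalized 'compound' score
# in [-1, 1], the same range as the sentiment column. If VADER is indeed
# what was used (an assumption), this should land near the 0.4404 shown
# in the first row of the sample.
print(analyzer.polarity_scores(body)["compound"])
```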