
ScraperWiki


ScraperWiki is a website for collaboratively building programs that extract and analyze public (online) data, in a wiki-like fashion. "Scraper" refers to screen scrapers, programs that extract data from websites. "Wiki" means that any user with programming experience can create or edit such programs, whether to extract new data or to analyze existing datasets.[1] The website primarily serves as a place where programmers and journalists collaborate on analyzing public data.[2][3][4][5][6]
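For illustration, a screen scraper of the kind hosted on the site might fetch a web page and extract structured pieces of it. The sketch below is a minimal, generic example in Python using the standard urllib module and the third-party BeautifulSoup library with a placeholder URL; it is not ScraperWiki's own tooling or API.

    # Minimal, generic screen-scraper sketch (not ScraperWiki-specific).
    # Fetches a page and prints the text and target of every hyperlink.
    from urllib.request import urlopen
    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    URL = "https://example.org/"  # placeholder URL for illustration

    html = urlopen(URL).read()
    soup = BeautifulSoup(html, "html.parser")

    for link in soup.find_all("a"):
        # Each link's visible text and its href attribute, if present.
        print(link.get_text(strip=True), link.get("href"))

On ScraperWiki, scrapers like this were written and edited in the browser and their results stored for others to reuse and analyze.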

References

  1. ^ Jamie Arnold (1 December 2009). "4iP invests in ScraperWiki". 4iP.
  2. ^ Cian Ginty (19 November 2010). "Hacks and hackers unite to get solid stories from difficult data". The Irish Times.
  3. ^ Paul Bradshaw (7 July 2010). "An introduction to data scraping with Scraperwiki". Online Journalism Blog.
  4. ^ Charles Arthur (22 November 2010). "Analysing data is the future for journalists, says Tim Berners-Lee". The Guardian.
  5. ^ Deirdre McArdle (19 November 2010). "In The Papers 19 November". ENN.
  6. ^ "Journalists and developers join forces for Lichfield 'hack day'". The Lichfield Blog. 15 November 2010.