QuickCode
{{Infobox website
| name = QuickCode
| logo = File:ScraperWiki logo.svg
| screenshot =
| collapsible =
| collapsetext =
| caption =
| url = {{URL|https://quickcode.io/}}
| commercial =
| type =
| language = English
| registration =
| owner =
| author =
| launch_date =
| current_status = Inactive
| revenue = Sponsored by 4iP{{cite web |author=Jamie Arnold |date=2009-12-01 |title=4iP invests in ScraperWiki |publisher=4iP |url=http://www.4ip.org.uk/2009/12/4ip-invests-in-scraperwiki/ }}
| content_license = GNU Affero General Public License{{cite web|url=https://github.com/sensiblecodeio/custard/blob/master/LICENCE|title=GNU Affero General Public License v3.0 - sensiblecodeio|website=GitHub|access-date=30 December 2017}}
}}
QuickCode (formerly ScraperWiki) was a web-based platform for collaboratively building programs to extract and analyze public online data, in a wiki-like fashion. "Scraper" refers to screen scrapers, programs that extract data from websites. "Wiki" means that any user with programming experience could create or edit such programs, whether to extract new data or to analyze existing datasets. The main use of the website was to provide a place for programmers and journalists to collaborate on analyzing public data.{{cite news |author=Cian Ginty |date=2010-11-19 |title=Hacks and hackers unite to get solid stories from difficult data |publisher=The Irish Times |url=http://www.irishtimes.com/newspaper/finance/2010/1119/1224283709384.html }}{{cite web |author=Paul Bradshaw |date=2010-07-07 |title=An introduction to data scraping with Scraperwiki |publisher=Online Journalism Blog |url=http://onlinejournalismblog.com/2010/07/07/an-introduction-to-data-scraping-with-scraperwiki/ }}{{cite news |author=Charles Arthur |date=2010-11-22 |title=Analysing data is the future for journalists, says Tim Berners-Lee |work=The Guardian |url=https://www.theguardian.com/media/2010/nov/22/data-analysis-tim-berners-lee }}{{cite news |author=Deirdre McArdle |date=2010-11-19 |title=In The Papers 19 November |publisher=ENN |url=http://www.enn.ie/story/show/10125973 }}{{cite web |date=2010-11-15 |title=Journalists and developers join forces for Lichfield 'hack day' |publisher=The Lichfield Blog |url=http://thelichfieldblog.co.uk/2010/11/15/journalists-and-developers-join-forces-for-lichfield-hack-day/ |access-date=2010-12-09 |archive-date=2010-11-24 |archive-url=https://web.archive.org/web/20101124082507/http://thelichfieldblog.co.uk/2010/11/15/journalists-and-developers-join-forces-for-lichfield-hack-day/ |url-status=dead }}{{cite news |author=Alison Spillane |date=2010-11-17 |title=Online tool helps to create greater public data transparency |publisher=Politico |url=http://politico.ie/index.php?option=com_content&view=article&id=6906:online-tool-helps-to-create-greater-public-data-transparency&catid=193:science-tech&Itemid=880 }}
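For illustration, a minimal screen scraper of this kind can be written in a few lines of Python. The following is a generic sketch using only the standard library; the URL is a placeholder, and the platform's own helper library for saving data is not shown:

<syntaxhighlight lang="python">
# Minimal screen scraper: fetch a page and extract every link target.
# Generic sketch; the URL is a placeholder, not a real dataset.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = urlopen("https://example.com/").read().decode("utf-8")
parser = LinkExtractor()
parser.feed(html)
for link in parser.links:
    print(link)
</syntaxhighlight>

In practice, scrapers written on the platform would save extracted rows to its built-in datastore rather than printing them, so that other users could analyze the collected data.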
The service was renamed circa 2016, as "it isn't a wiki or just for scraping any more".{{cite web|url=https://scraperwiki.com/|title=ScraperWiki|access-date=7 February 2017}} At the same time, the eponymous parent company was renamed 'The Sensible Code Company'.
History
ScraperWiki was founded in 2009 by Julian Todd and Aidan McGuire. It was initially funded by 4iP, the venture capital arm of the TV station Channel 4, and later attracted a further £1 million round of funding from Enterprise Ventures. McGuire is the chief executive officer of The Sensible Code Company.
References
{{Reflist}}
External links
- {{Official website|https://quickcode.io/}}
- [https://github.com/sensiblecodeio/custard GitHub repository of custard]
Category:Collaborative projects
Category:Social information processing
Category:Mashup (web application hybrid)
Category:Software using the GNU Affero General Public License
{{wiki-stub}}