Download ASPX page with R
There are a number of detailed answers out there that cover logging in to an authenticated ASPX site and downloading from it. As a complete n00b I haven't been able to find a simple explanation of how to get data out of a web form.

The following MWE is intended as an example only. The question is really meant to teach me how to handle a wider collection of webpages.
The website:
http://data.un.org/data.aspx?d=sna&f=group_code%3a101
Here is what I tried, which (obviously) failed:
    test <- read.csv('http://data.un.org/handlers/downloadhandler.ashx?datafilter=group_code:101;country_code:826&datamartid=sna&format=csv&c=2,3,4,6,7,8,9,10,11,12,13&s=_cr_engnameorderby:asc,fiscal_year:desc,_grit_code:asc')
This gives me gobbledegook when I View(test).
Anything that steps me through this, or points me in the right direction, would be gratefully received.
The URL you are accessing with read.csv is returning a zipped file. You can download it using httr and write the contents to a temp file:
    library(httr)
    # The handler returns a zip archive, not a plain CSV
    urlUN <- "http://data.un.org/handlers/downloadhandler.ashx?datafilter=group_code:101;country_code:826&datamartid=sna&format=csv&c=2,3,4,6,7,8,9,10,11,12,13&s=_cr_engnameorderby:asc,fiscal_year:desc,_grit_code:asc"
    response <- GET(urlUN)
    # Save the raw response body to disk (the "temp" directory must exist)
    writeBin(content(response, as = "raw"), "temp/temp.zip")
    # Find the name of the CSV inside the archive, extract it, and read it
    fname <- unzip("temp/temp.zip", list = TRUE)$Name
    unzip("temp/temp.zip", exdir = "temp")
    read.csv(paste0("temp/", fname))
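As a side note (not part of the original answer), you can avoid having to create a temp/ directory yourself by letting R manage the scratch space with tempfile() and tempdir(). A minimal sketch of the same download under that assumption:

    library(httr)
    # urlUN is the same long handler URL as above
    zipfile <- tempfile(fileext = ".zip")
    response <- GET(urlUN)
    writeBin(content(response, as = "raw"), zipfile)
    # Extract into R's session-specific temp directory and read the CSV
    fname <- unzip(zipfile, list = TRUE)$Name
    unzip(zipfile, exdir = tempdir())
    undata <- read.csv(file.path(tempdir(), fname))

Everything under tempdir() is cleaned up when the R session ends, so nothing is left behind on disk.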
Alternatively, the Hmisc package has a useful getZip function:
    library(Hmisc)
    urlUN <- "http://data.un.org/handlers/downloadhandler.ashx?datafilter=group_code:101;country_code:826&datamartid=sna&format=csv&c=2,3,4,6,7,8,9,10,11,12,13&s=_cr_engnameorderby:asc,fiscal_year:desc,_grit_code:asc"
    # getZip downloads the archive and returns a connection to the file inside
    undata <- read.csv(getZip(urlUN))
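Since the stated goal is to handle a wider collection of pages, note that the handler URL only differs in its query parameters, so it can be parameterised. A minimal sketch, assuming the same group_code throughout; 826 (the UK) comes from the question, while 36 (Australia) is just an assumed illustration, so check the codes on data.un.org itself:

    library(Hmisc)
    # Build the download URL for a given country code; everything except
    # country_code is copied verbatim from the example above
    unURL <- function(country_code) {
      paste0("http://data.un.org/handlers/downloadhandler.ashx",
             "?datafilter=group_code:101;country_code:", country_code,
             "&datamartid=sna&format=csv",
             "&c=2,3,4,6,7,8,9,10,11,12,13",
             "&s=_cr_engnameorderby:asc,fiscal_year:desc,_grit_code:asc")
    }
    # 826 = United Kingdom (from the question); 36 = Australia (assumed code)
    country_list <- c(826, 36)
    datasets <- lapply(country_list, function(cc) read.csv(getZip(unURL(cc))))

datasets is then a list of data frames, one per country code.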