Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown below.

Dataset generation failed (Error code: DatasetGenerationError) for two reasons:

- TooBigRowGroupsError: the first Parquet row group is 350,476,440 bytes, which exceeds the worker's limit of 300,000,000 bytes.
- CastError: a file with columns (package: string, license: string, license_gh: string) could not be cast to the expected schema (package: string, path: string, content: large_string, size: float64, license: string) because the column names don't match.
Columns: package (string) | path (string) | content (large_string) | size (float64) | license (string)

Row 1: package = igraphdata, path = cran-igraphdata-deedc8a/inst/getdata.R. Content:
# igraphdata R package
# Copyright (C) 2010-2012 Gabor Csardi <csardi.gabor@gmail.com>
# 334 Harvard st, 02139 Cambridge, MA, USA
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
# 02110-1301 USA
#
###################################################################
library(igraph)
tmp <- tempdir()
#####################################################################
## Foodwebs
u1 <- "http://vlado.fmf.uni-lj.si/pub/networks/data/bio/FoodWeb/Webs_paj.zip"
u2 <- "http://vlado.fmf.uni-lj.si/pub/networks/data/bio/FoodWeb/ATLSS_paj.zip"
foodzip <- paste(tmp, sep="/", c("f1.zip", "f2.zip"))
download.file(url=u1, destfile=foodzip[1])
download.file(url=u2, destfile=foodzip[2])
unlink(paste(tmp, sep="/", "paj"), recursive=TRUE)
system(paste("cd", tmp, ";", "unzip", foodzip[1]))
system(paste("cd", tmp, ";", "unzip", foodzip[2]))
system(paste("cd", tmp, ";", "mv *.paj paj/"))
pajfiles <- list.files(paste(tmp, sep="/", "paj"), full.names=TRUE)
readpaj <- function(filename) {
  lines <- readLines(filename)
  lines <- grep("^%", lines, invert=TRUE, value=TRUE)       # drop comments
  lines <- grep("^[ \t]*$", lines, invert=TRUE, value=TRUE) # drop empty lines
  eco <- lines[grep("^\\*partitio", lines)[1]:
               (grep("^\\*network", lines)[1]-1)]
  net <- lines[grep("^\\*network", lines)[1]:(grep("^\\*vector", lines)[1]-1)]
  bim <- lines[grep("^\\*vector", lines)[1]:length(lines)]
  tf <- tempfile()
  cat(net, file=tf, sep="\n")
  G <- read_graph(tf, format="pajek")
  V(G)$name <- V(G)$id
  G <- delete_vertex_attr(G, "id")
  V(G)$ECO <- as.numeric(eco[-(1:2)])
  V(G)$Biomass <- as.numeric(bim[-(1:2)])
  G
}
foodwebs <- lapply(pajfiles, readpaj)
names(foodwebs) <- sub("\\.paj$", "", basename(pajfiles))
foodwebs <- foodwebs[setdiff(names(foodwebs), c("Everglades", "Florida"))]
authors <- c("ChesLower", "Hagy, J.D.",
             "ChesMiddle", "Hagy, J.D.",
             "ChesUpper", "Hagy, J.D.",
             "Chesapeake", "Baird, D. and R.E. Ulanowicz",
             "CrystalC", "Homer, M. and W.M. Kemp",
             "CrystalD", "Homer, M. and W.M. Kemp",
             "Maspalomas",
             "Almunia, J., G. Basterretxea, J. Aristegui, and R.E. Ulanowicz",
             "Michigan", "Krause, A. and D. Mason",
             "Mondego", "Patricio, J.",
             "Narragan", "Monaco, M.E. and R.E. Ulanowicz",
             "Rhode", "Correll, D.",
             "StMarks", "Baird, D., J. Luczkovich and R. R. Christian",
             "baydry",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "baywet",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "cypdry",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "cypwet",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "gramdry",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "gramwet",
             "Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich",
             "mangdry",
             "Ulanowicz, R. E., C. Bondavalli, J. J. Heymans, and M. S. Egnotovich",
             "mangwet",
             "Ulanowicz, R. E., C. Bondavalli, J. J. Heymans, and M. S. Egnotovich"
)
Authors <- matrix(authors, nc=2, byrow=TRUE)
citations <-
"ChesLower,ChesMiddle,ChesUpper| Hagy, J.D. (2002) Eutrophication, hypoxia
    and trophic transfer efficiency in Chesapeake Bay PhD
    Dissertation, University of Maryland at College
    Park (USA), 446 pp.
Chesapeake| Baird D. & Ulanowicz R.E. (1989) The seasonal dynamics
    of the Chesapeake Bay ecosystem. Ecological Monographs
    59:329-364.
CrystalC,CrystalD| Homer, M. and W.M. Kemp. Unpublished Ms. See also
    Ulanowicz, R.E. 1986. Growth and Development:
    Ecosystems Phenomenology. Springer, New York. pp 69-79.
Maspalomas| Almunia, J., G. Basterretxea, J. Aristegui, and R.E.
    Ulanowicz. (1999) Benthic- Pelagic switching in a coastal
    subtropical lagoon. Estuarine, Coastal and Shelf
    Science 49:363-384.
Michigan| Krause, A. and D. Mason. (In preparation.) A. Krause,
    PhD. Dissertation, Michigan State University.
    Ann Arbor, MI. USA
Mondego| Patricio, J. (In Preparation) Master's Thesis.
    University of Coimbra, Coimbra, Portugal.
Narragan| Monaco, M.E. and R.E. Ulanowicz. (1997) Comparative
    ecosystem trophic structure of three U.S. Mid-Atlantic
    estuaries. Mar. Ecol. Prog. Ser. 161:239-254.
Rhode| Correll, D. (Unpublished manuscript) Smithsonian
    Institute, Chesapeake Bay Center for Environmental
    Research, Edgewater, Maryland 21037-0028 USA.
StMarks| Baird, D., J. Luczkovich and R. R. Christian. (1998)
    Assessment of spatial and temporal variability in
    ecosystem attributes of the St Marks National Wildlife
    Refuge, Apalachee Bay, Florida. Estuarine, Coastal, and
    Shelf Science 47: 329-349.
baydry,baywet| Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich.
    1998. Network analysis of trophic dynamics in South
    Florida ecosystems, FY 97: the Florida Bay ecosystem.
    Annual Report to the United States Geological Service
    Biological Resources Division, University of Miami Coral
    Gables, [UMCES] CBL 98-123, Maryland System Center for
    Environmental Science, Chesapeake Biological Laboratory,
    Maryland, USA.
cypdry,cypwet| Ulanowicz, R. E., C. Bondavalli, and M. S. Egnotovich.
    1997. Network analysis of trophic dynamics in South
    Florida ecosystems, FY 96: the cypress wetland ecosystem.
    Annual Report to the United States Geological Service
    Biological Resources Division, University of Miami Coral
    Gables, [UM-CES] CBL 97-075, Maryland System Center for
    Environmental Science, Chesapeake Biological Laboratory.
gramdry,gramwet| Ulanowicz, R. E., J. J. Heymans, and M. S. Egnotovich.
    2000. Network analysis of trophic dynamics in South
    Florida ecosystems, FY 99: the graminoid ecosystem.
    Technical Report TS-191-99, Maryland System Center for
    Environmental Science, Chesapeake Biological Laboratory,
    Maryland, USA.
mangdry,mangwet| Ulanowicz, R. E., C. Bondavalli, J. J. Heymans, and
    M. S. Egnotovich. 1999. Network analysis of trophic
    dynamics in South Florida ecosystems, FY 98: the mangrove
    ecosystem. Technical Report TS-191-99, Maryland System
    Center for Environmental Science, Chesapeake Biological
    Laboratory, Maryland, USA."
Citations <- readLines(textConnection(citations))
Citations2 <- Citations[1]
for (i in 2:length(Citations)) {
  if (grepl("^[ ]", Citations[i])) {
    Citations2[length(Citations2)] <- paste(Citations2[length(Citations2)],
                                            sub("^[ ]*", "", Citations[i]))
  } else {
    Citations2 <- c(Citations2, Citations[i])
  }
}
Citations2 <- strsplit(Citations2, split="|", fixed=TRUE)
ids <- lapply(Citations2, function(x) strsplit(x[1], ",")[[1]])
cits <- sub("^[ ]*", "", sapply(Citations2, "[[", 2))
Citations2 <- cbind(unlist(ids), rep(cits, sapply(ids, length)))
url <- "http://vlado.fmf.uni-lj.si/pub/networks/data/bio/foodweb/foodweb.htm"
name <-
"ChesLower| Lower Chesapeake Bay in Summer
ChesMiddle| Middle Chesapeake Bay in Summer
ChesUpper| Upper Chesapeake Bay in Summer
Chesapeake| Chesapeake Bay Mesohaline Network
CrystalC| Crystal River Creek (Control)
CrystalD| Crystal River Creek (Delta Temp)
Maspalomas| Charca de Maspalomas
Michigan| Lake Michigan Control network
Mondego| Mondego Estuary - Zostrea site
Narragan| Narragansett Bay Model
Rhode| Rhode River Watershed - Water Budget
StMarks| St. Marks River (Florida) Flow network
baydry| Florida Bay Trophic Exchange Matrix, dry season
baywet| Florida Bay Trophic Exchange Matrix, wet season
cypdry| Cypress Dry Season
cypwet| Cypress Wet Season
gramdry| Everglades Graminoids - Dry Season
gramwet| Everglades Graminoids - Wet Season
mangdry| Mangrove Estuary, Dry Season
mangwet| Mangrove Estuary, Wet Season"
Name <- read.delim(textConnection(name), sep="|", header=FALSE)
Name[,2] <- sub("^[ ]*", "", Name[,2])
for (n in names(foodwebs)) {
  foodwebs[[n]]$Citation <- Citations2[,2][match(n, Citations2[,1])]
  foodwebs[[n]]$Author <- Authors[,2][match(n, Authors[,1])]
  foodwebs[[n]]$URL <- url
  foodwebs[[n]]$name <- Name[,2][match(n, Name[,1])]
}
save(foodwebs, file="/tmp/foodwebs.rda")
#####################################################################
## Konigsberg
library(igraph)
edges <- '
from,to,Euler_letter,name
Altstadt-Loebenicht,Kneiphof,a,Kraemer Bruecke
Altstadt-Loebenicht,Kneiphof,b,Schmiedebruecke
Altstadt-Loebenicht,Lomse,f,Holzbruecke
Kneiphof,Lomse,e,Honigbruecke
Vorstadt-Haberberg,Lomse,g,Hohe Bruecke
Vorstadt-Haberberg,Kneiphof,c,Gruene Bruecke
Vorstadt-Haberberg,Kneiphof,d,Koettelbruecke'
vertices <- "
name,Euler_letter
Altstadt-Loebenicht,B
Kneiphof,A
Vorstadt-Haberberg,C
Lomse,D"
Koenigsberg <- graph_from_data_frame(read.csv(textConnection(edges)),
                                     vertices=read.csv(textConnection(vertices)),
                                     directed=FALSE)
Koenigsberg$name <- "The seven bridges of Koenigsberg"
save(Koenigsberg, file="/tmp/Koenigsberg.rda")
########################################################################
## Yeast protein interactions
## library(igraph)
## tmp <- tempdir()
## url <- "http://vlado.fmf.uni-lj.si/pub/networks/data/bio/Yeast/yeast.zip"
## yzip <- paste(tmp, sep="/", "y.zip")
## download.file(url=url, destfile=yzip)
## system(paste("cd", tmp, ";", "unzip", yzip))
## YS <- read_graph(paste(tmp, sep="/", "YeastS.net"), format="pajek")
## YL <- read_graph(paste(tmp, sep="/", "YeastL.net"), format="pajek")
## cluLines <- readLines(paste(tmp, sep="/", "Yeast.clu"))
## cluLines <- cluLines[(grep("^\\*vertices", cluLines)+1):length(cluLines)]
## ccode <- c("1"="T", "2"="M", "3"="U", "4"="C", "5"="F", "6"="P",
## "7"="G", "8"="D", "9"="O", "10"="E", "11"="R", "12"="B", "13"="A")
## V(YS)$name <- V(YS)$id
## V(YS)$Long_name <- V(YL)$id
## YS <- delete_vertex_attr(YS, "id")
## V(YS)$Class <- ccode[cluLines]
## YS$name <- "Yeast protein interaction network by Bu et al. 2003"
## YS$Citation <- "Dongbo Bu, Yi Zhao, Lun Cai, Hong Xue, Xiaopeng Zhu, Hongchao Lu, Jingfen Zhang, Shiwei Sun, Lunjiang Ling, Nan Zhang, Guojie Li and Runsheng Chen: Topological structure analysis of the protein–protein interaction network in budding yeast. Nucl. Acids Res. (2003) 31 (9): 2443-2450."
## YS$Author <- "Dongbo Bu, Yi Zhao, Lun Cai, Hong Xue, Xiaopeng Zhu, Hongchao Lu, Jingfen Zhang, Shiwei Sun, Lunjiang Ling, Nan Zhang, Guojie Li and Runsheng Chen"
## YS$URL <- "http://www.bioinfo.org.cn/PIN/"
## class <-
## "Category,Description,Original MIPS category
## E,energy production,energy
## G,aminoacid metabolism,aminoacid metabolism
## M,other metabolism,all remaining metabolism categories
## P,translation,protein synthesis
## T,transcription,\"transcription, but without subcategory 'transcriptional control'\"
## B,transcriptional control,subcategory 'transcriptional control'
## F,protein fate,\"protein fate (folding, modification, destination)\"
## O,cellular organization,cellular transport and transport mechanisms
## A,transport and sensing,categories 'transport facilitation' and 'regulation of / interaction with cellular environment'
## R,stress and defense,\"cell rescue, defense and virulence\"
## D,genome maintenance,DNA processing and cell cycle
## C,cellular fate / organization,categories 'cell fate' and 'cellular communication / signal transduction' and 'control of cellular organization'
## U,uncharacterized,categories 'not yet clear-cut' and 'uncharacterized'
## "
## classes <- read.csv(textConnection(class), header=TRUE, stringsAsFactors=FALSE)
## YS$Classes <- classes
## yeast <- YS
## save(yeast, file="/tmp/yeast.rda")
###########################################################################
## Yeast protein interactions, from the von Mering paper
library(igraph)
library(org.Sc.sgd.db)
tmp <- tempdir()
urls <- paste(sep="", "http://www.nature.com/nature/journal/v417/n6887/extref/nature750-s", 1:4, ".doc")
dest <- paste(sep="", tmp, "/s", 1:4, ".txt")
sapply(1:4, function(x) download.file(url=urls[x], destfile=dest[x]))
## Proteins
vert <- readLines(paste(tmp, sep="/", "s1.txt"))
vert <- vert[grep("^Y", vert)[1]:length(vert)]
vert <- vert[vert != ""]
vert12 <- sub("\\][ ].*$", "]", vert)
vert12 <- read.delim(textConnection(paste(vert12, sep="\n")),
header=FALSE, stringsAsFactors=FALSE, sep=" ")
vert12[,2] <- sub("\\]", "", sub("\\[", "", vert12[,2]))
colnames(vert12) <- c("name", "Class")
vert3 <- sub("^[^ ]+[ ][^ ]+[ ]", "", vert)
## Connections
int <- readLines(paste(tmp, sep="/", "s4.txt"))
int <- int[grep("^Y", int)[1]:length(int)]
int <- int[int != ""]
fromto <- t(sapply(strsplit(int, "[ ]+"), "[", 1:2))
highconf <- grep("confidence: high", int)
highmed <- grep("confidence: low", int, invert=TRUE)
## Classes
class <-
"Category,Description,Original MIPS category
E,energy production,energy
G,aminoacid metabolism,aminoacid metabolism
M,other metabolism,all remaining metabolism categories
P,translation,protein synthesis
T,transcription,\"transcription, but without subcategory 'transcriptional control'\"
B,transcriptional control,subcategory 'transcriptional control'
F,protein fate,\"protein fate (folding, modification, destination)\"
O,cellular organization,cellular transport and transport mechanisms
A,transport and sensing,categories 'transport facilitation' and 'regulation of / interaction with cellular environment'
R,stress and defense,\"cell rescue, defense and virulence\"
D,genome maintenance,DNA processing and cell cycle
C,cellular fate / organization,categories 'cell fate' and 'cellular communication / signal transduction' and 'control of cellular organization'
U,uncharacterized,categories 'not yet clear-cut' and 'uncharacterized'
"
classes <- read.csv(textConnection(class), header=TRUE, stringsAsFactors=FALSE)
# Create the network
yeast <- graph_from_data_frame(fromto[highmed,], directed=FALSE)
yeast$name <- "Yeast protein interactions, von Mering et al."
yeast$Citation <- "Comparative assessment of large-scale data sets of protein-protein interactions. Christian von Mering, Roland Krause, Berend Snel, Michael Cornell, Stephen G. Oliver, Stanley Fields and Peer Bork. Nature 417, 399-403 (2002)"
yeast$Author <- "Christian von Mering, Roland Krause, Berend Snel, Michael Cornell, Stephen G. Oliver, Stanley Fields and Peer Bork"
yeast$URL <- "http://www.nature.com/nature/journal/v417/n6887/full/nature750.html"
yeast$Classes <- classes
V(yeast)$Class <- vert12[,2][match(V(yeast)$name, vert12[,1])]
V(yeast)$Description <- vert3[match(V(yeast)$name, vert12[,1])]
E(yeast)$Confidence <- ifelse(grepl("confidence: high", int[highmed]),
"high", "medium")
save(yeast, file="/tmp/yeast.rda")
###################################################################
## Zachary karate club
library(igraph)
tmp <- tempdir()
url <- "http://vlado.fmf.uni-lj.si/pub/networks/data/UciNet/zachary.dat"
dest <- paste(tmp, sep="/", "k.dat")
download.file(url=url, destfile=dest)
l <- readLines(dest)
l <- l[(grep("^DATA", l)+1):length(l)]
l1 <- matrix(scan(textConnection(paste(l[1:34], collapse="\n"))), nr=34)
l2 <- matrix(scan(textConnection(paste(l[1:34+34], collapse="\n"))), nr=34)
karate <- graph_from_adjacency_matrix(l2, weighted=TRUE, mode="undirected")
V(karate)$Faction <- c(1,1,1,1,1,1,1,1, 2,2, 1,1,1,1, 2,2, 1,1, 2, 1, 2, 1,
2,2,2,2,2,2,2,2,2,2,2,2)
karate$name <- "Zachary's karate club network"
karate$Citation <- "Wayne W. Zachary. An Information Flow Model for Conflict and Fission in Small Groups. Journal of Anthropological Research Vol. 33, No. 4, 452-473"
karate$Author <- "Wayne W. Zachary"
save(karate, file="/tmp/karate.rda")
#####################################################################
## US airport network
tab <- read.csv("~/Downloads/1067890998_T_T100D_SEGMENT_ALL_CARRIER.csv")
tab <- tab[ tab$PASSENGERS != 0, ]
tab2 <- tab[,c("ORIGIN", "DEST", "UNIQUE_CARRIER_NAME", "DEPARTURES_PERFORMED", "SEATS", "PASSENGERS", "AIRCRAFT_TYPE", "DISTANCE")]
vert <- rbind(data.frame(name=tab$ORIGIN, CITY=tab$ORIGIN_CITY_NAME),
data.frame(name=tab$DEST,CITY=tab$DEST_CITY_NAME))
vert <- vert[ !duplicated(vert$name), ]
names(tab2) <- c("from", "to", "Carrier", "Departures", "Seats", "Passengers", "Aircraft", "Distance")
names(vert) <- c("name", "City")
library(igraph)
USairports <- graph_from_data_frame(tab2, vertices=vert)
USairports$name <- "US airports"
## Add positions
temp <- "http://www.armcode.com/airports/airport-%s.htm"
codes <- lapply(letters, function(x) {
  print(x)
  l <- readLines(sprintf(temp, x))
  r <- grep('class="row3"', l, value=TRUE)
  r2 <- sub("<TR><TD[^<]*</TD><TD[^<]*</TD><TD[^<]*</TD><TD[^>]*>", "", r)
  r3 <- grep("^<", r2, invert=TRUE, value=TRUE)
  c1 <- substr(r3, 1, 3)
  c2 <- sub("^.*>(.*)</a></TD></TR>", "\\1", r3)
  list(code=c1, pos=c2)
})
iata <- unlist(lapply(codes, "[[", 1))
pos <- unlist(lapply(codes, "[[", 2))
miss <- setdiff(V(USairports)$name, iata)
misspos <- sapply(miss, function(code) {
  print(code)
  try({
    l <- readLines(sprintf("http://www.airnav.com/airport/%s", code))
    e <- grep("Lat/Long: ", l, value=TRUE)
    e2 <- sub("^.*Lat/Long: .*valign=top>([^<]*)<BR>.*$", "\\1", e)
    g <- gsub("[^NSEW]", "", strsplit(e2, "/")[[1]])
    co <- round(as.numeric(gsub("[^0-9.]", "", strsplit(e2, "/")[[1]])))
    paste(g, co, sep="")
  })
})
stillmiss <- miss[sapply(misspos, inherits, "try-error")]
stillmiss <- cbind(stillmiss, V(USairports)[stillmiss]$City)
stillpos <- c("344059N 0902050W",
"664903N 1610120W",
"572817N 1534855W",
"573300N 1534500W",
"581000N 1523000W",
"574500N 1531900W",
"552431N 1321945W",
"621402N 1544405W",
"642215N 1611326W",
"635310N 1521807W",
"603522N 1520928W",
"594336N 1571533W",
"630150N 1633158W",
"551400N 1321300W",
"555656N 1333943W",
"555059N 1331340W",
"551421N 1320651W",
"581401N 1572101W",
"561904N 1583526W",
"592559N 1545827W",
"560021N 1603338W",
"605742N 1511954W",
"591900N 1545500W",
"355919N 1134836W",
"174449N 0644218W",
"181443N 0653836W",
"581300N 1573000W")
bak <- misspos
misspos[stillmiss[,1]] <- stillpos
misspos <- unlist(sapply(misspos, paste, collapse=" "))
misspos <- sub("([0-9]+)([NS]) ([0-9]+)([WE])", "\\2\\1 \\4\\3", misspos)
iata <- c(iata, names(misspos))
pos <- c(pos, unname(misspos))
V(USairports)$Position <- pos[match(V(USairports)$name, iata)]
save(USairports, file="/tmp/USairports.rda")
#####################################################################
## Kite
kite <- make_graph('krackhardt_kite')
kite$layout <- matrix(nc = 2, byrow = TRUE,
c(1,4, 1,2, 2,5, 2,3, 2,1, 3,4, 3,2, 4,3, 5,3, 6,3))
V(kite)$label <- LETTERS[1:10] # $
V(kite)$name <- V(kite)$label
V(kite)$Firstname <- c("Andre", "Beverly", "Carol", "Diane", "Ed",
"Fernando", "Garth", "Heather", "Ike", "Jane")
kite$name <- "Krackhardt's kite"
kite$Citation <- "Assessing the Political Landscape: Structure, Cognition, and Power in Organizations. David Krackhardt. Admin. Sci. Quart. 35, 342-369, 1990."
kite$Author <- "David Krackhardt"
kite$URL <- "http://www.orgnet.com/sna.html"
save(kite, file = "/tmp/kite.rda")
#####################################################################
## ENRON
vert <- read.delim("http://www.cis.jhu.edu/~parky/Enron/employees",
col.names = c("email", "other"),
header = FALSE, stringsAsFactors = FALSE)
vert$name <- sapply(strsplit(vert$other, " [ ]*"), "[[", 1)
vert$note <- paste(sapply(strsplit(vert$other, " [ ]*"), "[", 2),
sep = ", ",
sapply(strsplit(vert$other, " [ ]*"), "[", 3))
vert$note <- gsub("N/A", "NA", vert$note)
vert$note <- gsub(", NA", "", vert$note)
vert$name <- gsub("xxx", "NA", vert$name)
edges <- read.delim("http://www.cis.jhu.edu/~parky/Enron/execs.email.linesnum",
header = FALSE, col.names = c("time", "from", "to"),
stringsAsFactors = FALSE, sep = " ")
tags <- read.delim("http://www.cis.jhu.edu/~parky/Enron/execs.email.lines2",
header = FALSE, col.names = c("time", "from", "to", "tag"),
stringsAsFactors = FALSE, sep = " ")
all(tags[,1:3] == edges)
topics <- read.delim("http://www.cis.jhu.edu/~parky/Enron/execs.email.linesnum.topic",
header = TRUE, col.names = c("time", "from", "to", "topic"),
stringsAsFactors = FALSE, sep = " ")
all(topics[,1:3] == edges)
ldc_topics <- read.delim("http://www.cis.jhu.edu/~parky/Enron/execs.email.linesnum.ldctopic",
header = TRUE, col.names = c("time", "from", "to", "ldc_topic"),
stringsAsFactors = FALSE, sep = " ")
all(ldc_topics[,1:3] == edges)
ldc <- c("Calif_analysis", "Calif_bankruptcy", "Calif_utilities",
"Calif_crisis_legal", "Calif_enron", "Calif_federal",
"Newsfeed_Calif", "Calif_legis", "Daily_business",
"Educational", "EnronOnline", "Kitchen_daily",
"Kitchen_fortune", "Energy_newsfeed", "General_newsfeed",
"Downfall", "Downfall_newsfeed", "Broadband",
"Federal_gov", "FERC_DOE", "College Football",
"Pro Football", "India_General", "India_Dabhol",
"Nine_eleven", "Nine_Eleven_Analysis", "Dynegy",
"Sempra", "Duke", "El Paso",
"Pipelines", "World_energy")
ldc_desc_text <- "
Executive summaries and analyses about the California situation. (304
entries)

Specifically mentioned financial difficulties of the utilities such as
Southern California Edison (SoCal Edison) and Pacific Gas & Electric
(PG & E). (36 entries)

General references to California utility companies: Edison, Pacific Gas &
Electric, and the California Public Utility Commission (CPUC) which
regulates them. (116)

Articles about legal issues surrounding California energy crisis. (109)

Enron business emails about the day to day operations of managing the
California side of their business. (699)

Emails about FERC (Federal Energy Regulatory Commission), U.S. Senate
Hearings. (61)

Long emails with a host of stories about California. These emails were news
feeds from wire services such as Reuters and Dow Jones, which were widely
circulated among Enron employees. (190)

Emails about California legislature, bills in the California legislature or
California Governor Gray Davis that are not related to the specifics such
as bankruptcy or the energy crisis. (181)

As one might expect, the majority of the emails in this collection are
emails about the regular day to day activities of a multinational energy
company (i.e. “ trade this share, buy these shares,” etc.). Other daily
business emails include setting up meetings, confirming meetings, and
general announcements from human resources. These almost defy
categorization by topic, but they do have a value. Researchers may decide
to remove these emails to reduce the amount of noise in the collection and
to improve their ability to detect topics. However keeping them in the
collection provides an element of noise that gives the collection a “real
life” quality. Either way by tagging such emails, the researcher has the
option. (1595)

This was a surprise topic that emerged later. It related to Enron's
interns, scholarships or employees who are professors. Many of these emails
center around the Head of the Research Group Vince Kaminski who taught at
Rice University in Houston part time. (92)

Enrononline is the electronic trading and information software tool that
the Enron traders used. It was an invaluable asset to the company and gave
them an edge on their competitors. Louise Kitchen was an early developer of
the technology. (271)

Daily emails to and from Louise Kitchen who developed Enrononline. This
category includes questions to Kitchen about running EOL and trading
information. (37)

Louise Kitchen was selected as one to the top corporate women in a Fortune
magazine story (September 2001). (11)

Wire news feeds about various energy issues. Think of it as an electronic
newsletter about energy that is circulation to a number of Enron employees.
Usually these are lengthy emails. (332)

Long emails (wire feeds) with a host of general national and international
stories. (48)

Articles about Enron's demise. Messages from employees worrying about what
is going on. This includes announcements from management about “not
worrying about it.” (158)

Wire stories about Enron's demise. (48)

Enron Broadband Services (EBS) Enron's failed Broadband (high speed cable
to deliver entertainment) venture. (26)

General information about Federal government that does not specifically
mention California. (85)

General information about the Federal Energy Regulatory
Commission/Department of Energy. (219)

Employee emails about college football more specifically a newsletter
called TruOrange, which follows University of Texas football. (100)

Employee emails about professional football (The NFL), but these refer to
fantasy pro football leagues, where the statistics of real players are used
to play an online version of football. (6)

General information about the India energy issues. (38)

Specific references to India Dabhol Power Company (DPC), the Maharastra
State Electricity Board (MSEB), and the Indian province of Maharastra. (79)

The terrorist attack of September 11, 2001. Mostly newscasts and
updates. (29)

Aftermath analysis (political and economic) resulting from the attack. (30)

This company was a competitor of Enron. They almost purchased Enron in
Oct-Nov. 2001, but let Enron plummet into bankruptcy instead. (7)

A utilities company that works with Enron. (16)

Emails about Duke Energy. (17)

Emails about El Paso Energy/Pipeline Company. (34)

General pipeline management. Note that pipelines are important part in
transporting energy from one place to another. Enron’s original business
was a pipeline business. (17)

A general category about energy with one or more specific geographic
locations (such as Asia, Africa) that is not about India. (25)
"
ldc_desc <- strsplit(ldc_desc_text, "\n\n")[[1]]
ldc_desc <- gsub("\n", " ", ldc_desc)
ldc_desc <- gsub("^[[:space:]]+", "", ldc_desc)
ldc_desc <- gsub("[[:space:]]+$", "", ldc_desc)
reciptype_code <- c("0" = "to", "1" = "cc", "2" = "bcc")
reciptype <- unname(reciptype_code[as.character(tags[,4])])
g_edges <- data.frame(
  stringsAsFactors = FALSE,
  edges,
  reciptype = reciptype,
  topic = topics[,4],
  ldc_topic = ldc_topics[,4]
)
g_edges$time <- as.character(as.POSIXct(edges$time, origin = "1970-01-01",
tz = "UTC"))
g_edges <- g_edges[c("from", "to", "time", "reciptype", "topic", "ldc_topic")]
names(g_edges) <- c("from", "to", "Time", "Reciptype", "Topic", "LDC_topic")
g_vert <- cbind(id = seq_len(nrow(vert)) - 1, vert)
g_vert <- g_vert[, colnames(g_vert) != "other"]
names(g_vert) <- c("id", "Email", "Name", "Note")
enron <- make_directed_graph(t(as.matrix(g_edges[,1:2])) + 1)
V(enron)$Email <- g_vert$Email
V(enron)$Name <- g_vert$Name
V(enron)$Note <- g_vert$Note
E(enron)$Time <- g_edges$Time
E(enron)$Reciptype <- g_edges$Reciptype
E(enron)$Topic <- g_edges$Topic
E(enron)$LDC_topic <- g_edges$LDC_topic
enron$LDC_names <- ldc
enron$LDC_desc <- ldc_desc
enron$name <- "Enron email network"
enron$Citation <- c('C.E. Priebe, J.M. Conroy, D.J. Marchette, and Y. Park, "Scan Statistics on Enron Graphs," Computational and Mathematical Organization Theory, Volume 11, Number 3, p229 - 247, October 2005, Springer Science+Business Media B.V.')
#####################################################################
## RFID
library(sand)
data(hc)
edges <- hc[,c(2,3,1)]
vv <- character(max(hc[,2:3]))
vv[hc$ID1] <- as.character(hc$S1)
vv[hc$ID2] <- as.character(hc$S2)
v <- data.frame(id = seq_along(vv), Status = vv, stringsAsFactors = FALSE)
rfid <- graph_from_data_frame(edges, vertices = v, directed = FALSE)
rfid <- delete_vertex_attr(rfid, 'name')
rfid$name <- "RFID hospital encounter network"
rfid$Citation <- "P. Vanhems, A. Barrat, C. Cattuto, J.-F. Pinton, N. Khanafer, C. Regis, B.-a. Kim, B. Comte, N. Voirin: Estimating potential infection transmission routes in hospital wards using wearable proximity sensors. PloS One 8(9), e73970 306 (2013)."
| 30,568 | cc-by-sa-4.0 |
assertthat | cran-assertthat-b28a7b8/R/assert-that.r | #' Assert that certain conditions are true.
#'
#' \code{assert_that} is a drop-in replacement for \code{\link{stopifnot}} but
#' is designed to give informative error messages.
#'
#' @section Assertions:
#'
#' Assertion functions should return a single \code{TRUE} or \code{FALSE}:
#' any other result is an error, and \code{assert_that} will complain about
#' it. This will always be the case for the assertions provided by
#' \code{assertthat}, but you may need to be more careful with
#' base R functions.
#'
#' To make your own assertions that work with \code{assert_that},
#' see the help for \code{\link{on_failure}}. Alternatively, a custom message
#' can be specified for each call.
#'
#' @param ... unnamed expressions that describe the conditions to be tested.
#' Rather than combining expressions with \code{&&}, separate them by commas
#' so that better error messages can be generated.
#' @param env (advanced use only) the environment in which to evaluate the
#' assertions.
#' @param msg a custom error message to be printed if one of the conditions is
#' false.
#' @seealso \code{\link{validate_that}}, which returns a message (not an error)
#' if the condition is false.
#' @export
#' @examples
#' x <- 1
#' # assert_that() generates errors, so can't be usefully run in
#' # examples
#' \dontrun{
#' assert_that(is.character(x))
#' assert_that(length(x) == 3)
#' assert_that(is.dir("asdf"))
#' y <- tempfile()
#' writeLines("", y)
#' assert_that(is.dir(y))
#' assert_that(FALSE, msg = "Custom error message")
#' }
#'
#' # But see_if() just returns the value, so it appears a lot in the
#' # examples; remember to use assert_that() in your code.
#' see_if(is.character(x))
#' see_if(length(x) == 3)
#' see_if(is.dir(17))
#' see_if(is.dir("asdf"))
#' see_if(5 < 3, msg = "Five is not smaller than three")
assert_that <- function(..., env = parent.frame(), msg = NULL) {
res <- see_if(..., env = env, msg = msg)
if (res) return(TRUE)
stop(assertError(attr(res, "msg")))
}
assertError <- function (message, call = NULL) {
class <- c("assertError", "simpleError", "error", "condition")
structure(list(message = message, call = call), class = class)
}
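The `@param ...` docs above recommend separating conditions with commas rather than `&&`; a quick sketch (not part of the package source, assuming assertthat is installed) of what `see_if()` hands back on failure:

```r
library(assertthat)

x <- 1:3
# Each comma-separated condition is evaluated in turn; the first
# failing one determines the message.
res <- see_if(is.numeric(x), length(x) == 2)
isTRUE(res)       # FALSE: the second condition failed
attr(res, "msg")  # a human-readable explanation of the failure
```

The `"msg"` attribute is the same text that `assert_that()` turns into an error.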
#' @rdname assert_that
#' @export
see_if <- function(..., env = parent.frame(), msg = NULL) {
asserts <- eval(substitute(alist(...)))
for (assertion in asserts) {
res <- tryCatch({
eval(assertion, env)
}, assertError = function(e) {
structure(FALSE, msg = e$message)
})
check_result(res)
# Failed, so figure out message to produce
if (!res) {
if (is.null(msg))
msg <- get_message(res, assertion, env)
return(structure(FALSE, msg = msg))
}
}
res
}
check_result <- function(x) {
if (!is.logical(x))
stop("assert_that: assertion must return a logical value", call. = FALSE)
if (any(is.na(x)))
stop("assert_that: missing values present in assertion", call. = FALSE)
if (length(x) != 1) {
stop("assert_that: length of assertion is not 1", call. = FALSE)
}
TRUE
}
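A sketch (assuming assertthat is installed) of the three rejections `check_result()` implements — an assertion must evaluate to a single, non-missing logical:

```r
library(assertthat)

# Result must be a single non-missing logical:
try(see_if(c(TRUE, TRUE)))  # length-2 result is itself an error
try(see_if(NA))             # missing value is itself an error
try(see_if("yes"))          # non-logical result is itself an error
# A single FALSE is allowed and simply yields a failure message:
see_if(1 == 2)
```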
get_message <- function(res, call, env = parent.frame()) {
stopifnot(is.call(call), length(call) >= 1)
if (has_attr(res, "msg")) {
return(attr(res, "msg"))
}
f <- eval(call[[1]], env)
if (!is.primitive(f)) call <- match.call(f, call)
fname <- deparse(call[[1]])
fail <- on_failure(f) %||% base_fs[[fname]] %||% fail_default
fail(call, env)
}
# The default failure message works in the same way as stopifnot(), so you
# can continue to use any function that returns a logical value: you just
# won't get a friendly error message.
# The deparsed call is truncated to its first line (about 60 characters),
# with "..." appended if it spilled over.
fail_default <- function(call, env) {
call_string <- deparse(call, width.cutoff = 60L)
if (length(call_string) > 1L) {
call_string <- paste0(call_string[1L], "...")
}
paste0(call_string, " is not TRUE")
}
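A sketch of the fallback path (assuming assertthat is installed): a user-defined function with no `on_failure()` handler and no `base_fs` entry falls through to `fail_default()`, so the message is just the deparsed call:

```r
library(assertthat)

# A hypothetical predicate with no custom failure message registered:
is_weekend <- function(x) x %in% c("sat", "sun")
res <- see_if(is_weekend("mon"))
attr(res, "msg")  # roughly: is_weekend(x = "mon") is not TRUE
```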
| 3,882 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/assertions-file.r | #' @include on-failure.r
NULL
path_is_not <- function(thing, var = "x") {
function(call, env) {
paste0("Path '", eval(call[[var]], env), "' is not ", thing)
}
}
#' Useful tests related to files
#'
#' @param path a file path to examine
#' @name assertions-file
#' @examples
#' see_if(is.dir(1))
#'
#' tmp <- tempfile()
#' see_if(file.exists(tmp))
#' see_if(is.dir(tmp))
#'
#' writeLines("x", tmp)
#' see_if(file.exists(tmp))
#' see_if(is.dir(tmp))
#' see_if(is.writeable(tmp))
#' see_if(is.readable(tmp))
#' unlink(tmp)
#'
#' see_if(is.readable(tmp))
NULL
#' @export
#' @rdname assertions-file
is.dir <- function(path) {
assert_that(is.string(path), file.exists(path))
file.info(path)$isdir
}
on_failure(is.dir) <- path_is_not("a directory", "path")
#' @export
#' @rdname assertions-file
is.writeable <- function(path) {
assert_that(is.string(path), file.exists(path))
file.access(path, mode = 2)[[1]] == 0
}
on_failure(is.writeable) <- path_is_not("writeable", "path")
#' @export
#' @rdname assertions-file
is.readable <- function(path) {
assert_that(is.string(path), file.exists(path))
file.access(path, mode = 4)[[1]] == 0
}
on_failure(is.readable) <- path_is_not("readable", "path")
#' @param ext extension to test for (\code{has_extension} only)
#' @export
#' @rdname assertions-file
has_extension <- function(path, ext) {
tools::file_ext(path) == ext
}
on_failure(has_extension) <- function(call, env) {
path <- eval(call$path, env)
ext <- eval(call$ext, env)
paste0("File '", basename(path), "' does not have extension ", ext)
}
| 1,568 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/assertions-scalar.R | #' @include on-failure.r
NULL
#' Assert input is a scalar.
#'
#' \code{is.scalar} provides a generic method for checking input is a scalar.
#' \code{is.string}, \code{is.flag}, \code{is.number} and \code{is.count}
#' provide tests for specific types.
#'
#' @family assertions
#' @param x object to test
#' @name scalar
#' @aliases NULL
NULL
#' @rdname scalar
#' @export
#' @examples
#' # Generic check for scalars
#' see_if(is.scalar("a"))
#' see_if(is.scalar(1:10))
#'
is.scalar <- function(x) {
length(x) == 1L
}
on_failure(is.scalar) <- function(call, env) {
paste0(deparse(call$x), " is not a scalar.")
}
#' @rdname scalar
#' @export
#' @examples
#' # string = scalar character vector
#' see_if(is.string(1:3))
#' see_if(is.string(c("a", "b")))
#' see_if(is.string("x"))
#'
is.string <- function(x) is.character(x) && length(x) == 1
on_failure(is.string) <- function(call, env) {
paste0(deparse(call$x), " is not a string (a length one character vector).")
}
#' @rdname scalar
#' @export
#' @examples
#' # number = scalar numeric/integer vector
#' see_if(is.number(1:3))
#' see_if(is.number(1.5))
#'
is.number <- function(x) is.numeric(x) && length(x) == 1
on_failure(is.number) <- function(call, env) {
paste0(deparse(call$x), " is not a number (a length one numeric vector).")
}
#' @rdname scalar
#' @export
#' @examples
#' # flag = scalar logical vector
#' see_if(is.flag(1:3))
#' see_if(is.flag("a"))
#' see_if(is.flag(c(FALSE, FALSE, TRUE)))
#' see_if(is.flag(FALSE))
#'
is.flag <- function(x) is.logical(x) && length(x) == 1
on_failure(is.flag) <- function(call, env) {
paste0(deparse(call$x), " is not a flag (a length one logical vector).")
}
#' @rdname scalar
#' @export
#' @examples
#' # count = scalar positive integer
#' see_if(is.count("a"))
#' see_if(is.count(-1))
#' see_if(is.count(1:5))
#' see_if(is.count(1.5))
#' see_if(is.count(1))
#'
is.count <- function(x) {
if (length(x) != 1) return(FALSE)
if (!is.integerish(x)) return(FALSE)
# is.na() to handle NA_integer_
x > 0 && !is.na(x)
}
on_failure(is.count) <- function(call, env) {
paste0(deparse(call$x), " is not a count (a single positive integer)")
}
| 2,199 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/assertions.r | #' @include on-failure.r
NULL
is.integerish <- function(x) {
# trunc() copes with very large numbers (including Inf); checking anyNA()
# first handles NaN and NA_real_ and avoids a vector condition in &&
is.integer(x) || (is.numeric(x) && !anyNA(x) && all(x == trunc(x)))
}
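`is.integerish()` is internal to the package; a sketch using a local copy of the helper shows the boundary cases the test suite exercises:

```r
# Local copy of the helper above (not exported by assertthat).
is_integerish <- function(x) {
  is.integer(x) || (is.numeric(x) && !anyNA(x) && all(x == trunc(x)))
}

is_integerish(1L)     # TRUE
is_integerish(2)      # TRUE: a whole double counts
is_integerish(2.5)    # FALSE
is_integerish(1e10)   # TRUE: bigger than .Machine$integer.max, still whole
is_integerish(Inf)    # TRUE: trunc(Inf) is Inf
is_integerish(NA)     # FALSE: logical, not numeric
```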
# is.positive.integer
# is.negative.integer
# is.positive.double
# is.negative.double
is.named <- function(x) {
nm <- names(x)
!is.null(nm) && all(!is.na(nm) & nm != "")
}
on_failure(is.named) <- function(call, env) {
paste0("Not all elements of ", deparse(call$x), " have names.")
}
#' Has attribute or name?
#'
#' @param x object to test
#' @param which name or attribute
#' @export
#' @examples
#' has_attr(has_attr, "fail")
#' x <- 10
#' x %has_attr% "a"
#'
#' y <- list(a = 1, b = 2)
#' see_if(y %has_name% "c")
#' see_if(y %has_name% c("a", "g", "f"))
has_attr <- function(x, which) !is.null(attr(x, which, exact = TRUE))
on_failure(has_attr) <- function(call, env) {
paste0(deparse(call$x), " does not have attribute ", eval(call$which, env))
}
#' @export
#' @rdname has_attr
"%has_attr%" <- has_attr
#' @export
#' @rdname has_attr
has_name <- function(x, which){
all(which %in% names(x))
}
on_failure(has_name) <- function(call, env) {
out_names <- paste0("'", paste0(eval(call$which, env), collapse = "', '"), "'")
paste0(deparse(call$x), " does not have all of these name(s): ", out_names)
}
#' @export
#' @rdname has_attr
"%has_name%" <- has_name
#' Does object contain any missing values?
#'
#' @family assertions
#' @param x object to test
#' @export
#' @examples
#' see_if(noNA("a"))
#' see_if(noNA(c(TRUE, NA)))
#' x <- sample(c(1:10, NA), 100, rep = TRUE)
#' see_if(noNA(x))
noNA <- function(x) {
!(any(is.na(x)))
}
on_failure(noNA) <- function(call, env) {
n <- sum(is.na(eval(call$x, env)))
paste0(deparse(call$x), " contains ", n, " missing values")
}
#' Are two objects equal?
#'
#' @param x,y objects to compare
#' @param ... additional arguments passed to \code{\link{all.equal}}
#' @family assertions
#' @export
#' @examples
#' x <- 2
#' see_if(are_equal(x, 1.9))
#' see_if(are_equal(x, 1.999, tol = 0.01))
#' see_if(are_equal(x, 2))
are_equal <- function(x, y, ...) {
isTRUE(all.equal(x, y, ...))
}
on_failure(are_equal) <- function(call, env) {
paste0(deparse(call$x), " not equal to ", deparse(call$y))
}
#' Missing is functions.
#'
#' @param x object to test
#' @family assertions
#' @name assert-is
#' @aliases NULL
#' @examples
#' a <- Sys.time()
#' is.time(a)
#' b <- Sys.Date()
#' is.date(b)
#' c <- try(stop("!!"))
#' is.error(c)
NULL
#' @export
#' @rdname assert-is
is.error <- function(x) inherits(x, "try-error")
on_failure(is.error) <- function(call, env) {
paste0(deparse(call$x), " is not a try-error")
}
#' @export
#' @rdname assert-is
is.time <- function(x) inherits(x, "POSIXt")
on_failure(is.time) <- function(call, env) {
paste0(deparse(call$x), " is not a POSIXt date-time object")
}
#' @export
#' @rdname assert-is
is.date <- function(x) inherits(x, "Date")
on_failure(is.date) <- function(call, env) {
paste0(deparse(call$x), " is not a Date object")
}
#' Check a function has specified arguments
#'
#' @param f a function
#' @param args a character vector of argument names
#' @param exact if \code{TRUE}, argument names must match \code{args}
#' exactly (order and value); otherwise \code{f} just must have at least
#' \code{args} in any order
#' @export
#' @examples
#' has_args(mean, "x")
#' has_args(mean, "x", exact = TRUE)
#'
#' see_if(mean %has_args% "x")
#' see_if(mean %has_args% "y")
has_args <- function(f, args, exact = FALSE) {
assert_that(is.function(f))
if (exact) {
identical(args, names(formals(f)))
} else {
all(args %in% names(formals(f)))
}
}
on_failure(has_args) <- function(call, env) {
args <- paste(eval(call$args, env), collapse = ", ")
paste0("Function " , deparse(call$f), " does not have arguments ", args)
}
#' @export
#' @rdname has_args
"%has_args%" <- function(f, args) has_args(f, args)
#' Check an object doesn't have any empty dimensions
#'
#' @param x object to test
#' @family assertions
#' @export
#' @examples
#' not_empty(numeric())
#' not_empty(mtcars[0, ])
#' not_empty(mtcars[, 0])
not_empty <- function(x) {
all((dim(x) %||% length(x)) != 0)
}
on_failure(not_empty) <- function(call, env) {
paste0(deparse(call$x), " has an empty dimension")
}
| 4,368 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/base-comparison.r | #' @include base.r
NULL
logical_is_not <- function(failed) {
function(call, env) {
lhs <- paste(deparse(call[[2]]), collapse = "")
rhs <- paste(deparse(call[[3]]), collapse = "")
paste0(lhs, " not ", failed, " ", rhs)
}
}
base_fs$"==" <- logical_is_not("equal to")
base_fs$"<" <- logical_is_not("less than")
base_fs$">" <- logical_is_not("greater than")
base_fs$">=" <- logical_is_not("greater than or equal to")
base_fs$"<=" <- logical_is_not("less than or equal to")
base_fs$"!=" <- logical_is_not("not equal to")
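A sketch of the effect of these registrations (assuming assertthat is installed): once a comparison operator has a `base_fs` entry, a failing bare comparison names both sides:

```r
library(assertthat)

x <- 5
attr(see_if(x < 4), "msg")   # "x not less than 4"
attr(see_if(x == 6), "msg")  # "x not equal to 6"
attr(see_if(x >= 9), "msg")  # "x not greater than or equal to 9"
```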
| 536 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/base-is.r | #' @include base.r
NULL
is_not <- function(thing) {
function(call, env) {
paste0(deparse(call[[2]]), " is not ", thing)
}
}
# Vectors
base_fs$is.atomic <- is_not("an atomic vector")
base_fs$is.character <- is_not("a character vector")
base_fs$is.complex <- is_not("a complex vector")
base_fs$is.double <- is_not("a numeric vector")
base_fs$is.integer <- is_not("an integer vector")
base_fs$is.numeric <- is_not("a numeric or integer vector")
base_fs$is.raw <- is_not("a raw vector")
base_fs$is.vector <- is_not("an atomic vector without attributes")
# Factors
base_fs$is.factor <- is_not("a factor")
base_fs$is.ordered <- is_not("an ordered factor")
# More complicated data structures
base_fs$is.array <- is_not("an array")
base_fs$is.data.frame <- is_not("a data frame")
base_fs$is.list <- is_not("a list")
base_fs$is.matrix <- is_not("a matrix")
base_fs$is.null <- is_not("NULL")
# Functions and environments
base_fs$is.environment <- is_not("an environment")
base_fs$is.function <- is_not("a function")
base_fs$is.primitive <- is_not("a primitive function")
# Computing on the language
base_fs$is.call <- is_not("a quoted call")
base_fs$is.expression <- is_not("an expression object")
base_fs$is.name <- is_not("a name")
base_fs$is.pairlist <- is_not("a pairlist")
base_fs$is.recursive <- is_not("a recursive object")
base_fs$is.symbol <- is_not("a name")
# Catch all
base_fs$inherits <- function(call, env) {
class <- eval(call$what, env)
paste0(deparse(call$x), " does not inherit from class ", class)
}
| 1,529 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/base-logical.r | #' @include base.r
NULL
base_fs$"&&" <- function(call, env) {
lhs <- eval(call[[2]], env)
if (!lhs) {
get_message(lhs, call[[2]], env)
} else {
rhs <- eval(call[[3]], env)
get_message(rhs, call[[3]], env)
}
}
base_fs$"||" <- function(call, env) {
lhs <- eval(call[[2]], env)
l_msg <- get_message(lhs, call[[2]], env)
rhs <- eval(call[[3]], env)
r_msg <- get_message(rhs, call[[3]], env)
paste0(l_msg, " or ", r_msg)
}
base_fs$any <- function(call, env) {
paste0("No elements of ", deparse(call[[2]]), " are true")
}
base_fs$all <- function(call, env) {
res <- eval(call[[2]], env)
i <- which(!res)
if (length(i) > 10) i <- c(i[1:5], "...")
paste0("Elements ", paste(i, collapse = ", "), " of ",
deparse(call[[2]]), " are not true")
}
| 784 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/base-misc.r | #' @include base.r
NULL
base_fs$file.exists <- function(call, env) {
path <- eval(call[[2]], env)
paste0("Path '", path, "' does not exist")
}
base_fs$anyDuplicated <- function(call, env) {
paste0(call$x, " is not unique")
}
base_fs$identical <- function(call, env) {
paste0(deparse(call$x), " not identical to ", deparse(call$y))
}
| 344 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/base.r | base_fs <- new.env(parent = emptyenv())
| 40 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/on-failure.r | #' Custom failure messages for assertions.
#'
#' @param x a assertion function that returns \code{TRUE} if the assertion
#' is met, \code{FALSE} otherwise.
#' @param value a function with parameters \code{call} and \code{env}
#' that returns a custom error message as a string.
#' @export
#' @examples
#' is_odd <- function(x) {
#' assert_that(is.numeric(x), length(x) == 1)
#' x %% 2 == 1
#' }
#' see_if(is_odd(2))
#'
#' on_failure(is_odd) <- function(call, env) {
#' paste0(deparse(call$x), " is even")
#' }
#' see_if(is_odd(2))
on_failure <- function(x) attr(x, "fail")
#' @export
#' @rdname on_failure
#' @usage on_failure(x) <- value
"on_failure<-" <- function(x, value) {
stopifnot(is.function(x), identical(names(formals(value)), c("call", "env")))
attr(x, "fail") <- value
x
}
| 803 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/utils.r | "%||%" <- function(a, b) if (is.null(a)) b else a
| 50 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/R/validate-that.R | #' Validate that certain conditions are true.
#'
#' \code{validate_that} is an alternative to the function
#' \code{\link{assert_that}} that returns a \code{character} vector on
#' failure instead of throwing an error. This makes it easier to use
#' within S4 \code{"validity"} methods.
#'
#' @inheritParams assert_that
#' @return A \code{character} vector if the assertion is false, or \code{TRUE}
#' if the assertion is true.
#' @export
#' @seealso \code{\link{assert_that}}, which returns an error if the condition
#' is false.
#' @examples
#' x <- 1
#' # unlike assert_that(), validate_that() returns a message rather than
#' # an error, so it can be run in examples
#' validate_that(is.numeric(x))
#' validate_that(is.character(x))
#' validate_that(length(x) == 3)
#' validate_that(is.dir("asdf"))
validate_that <- function(..., env = parent.frame(), msg = NULL) {
res <- see_if(..., env = env, msg = msg)
if (res) return(TRUE)
return(attr(res, "msg"))
}
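A sketch of the S4 use case mentioned above (hypothetical `Interval` class, assuming the methods package): the validity function can return `validate_that()`'s result directly, since S4 expects either `TRUE` or a character description of the problem:

```r
library(methods)
library(assertthat)

setClass("Interval",
  slots = c(a = "numeric", b = "numeric"),
  validity = function(object) {
    validate_that(
      is.number(object@a),
      is.number(object@b),
      object@a <= object@b
    )
  }
)

# new("Interval", a = 2, b = 1) now fails with a message along the lines
# of: invalid class "Interval" object: object@a not less than or equal
# to object@b
```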
| 888 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat.R | library(testthat)
library(assertthat)
test_check("assertthat")
| 64 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-assert-that.R | context("assert_that")
test_that("assert_that handles long false assertions gracefully", {
expect_error(
assert_that(isTRUE(10 + sqrt(25) + sum(1:10) + sqrt(25) + sum(11:20) + sqrt(25) + sum(21:30) + sqrt(25) + sum(31:40) + sqrt(25) + sum(41:50))),
"^isTRUE\\(.* [.]{3} is not TRUE$"
)
})
test_that("assert_that handles has_name failures with multiple missing names", {
x <- list(a = TRUE, b = "hello")
expect_error(
assert_that(has_name(x, c("a", "f", "g"))),
regexp = "x does not have all of these name"
)
})
| 565 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-assertions.R | context("Assertion assertions")
test_that("is.integerish works correctly", {
expect_true(is.integerish(1L))
expect_true(is.integerish(c(1L, 2L, 3L)))
expect_true(is.integerish(c(1L, NA, 3L)))
expect_false(is.integerish(c(1L, 2.1, 3L)))
# base::.Machine holds info on machine numerical precision
expect_false(is.integerish(1L + .Machine$double.eps))
expect_false(is.integerish(1L - .Machine$double.neg.eps))
# numbers larger than base::.Machine$integer.max shouldn't trip this up
expect_true(is.integerish(Inf))
expect_true(is.integerish(-Inf))
expect_true(is.integerish(1e10))
expect_true(is.integerish(-1e10))
expect_false(is.integerish(1e10 + 0.0002))
expect_false(is.integerish(1e10 - 0.0002))
expect_false(is.integerish(NA))
expect_false(is.integerish(NA_real_))
expect_false(is.integerish(NULL))
expect_false(is.integerish(NaN))
})
test_that("is.named works correctly", {
expect_false(is.named(1))
x <- 1:3
expect_false(is.named(x))
names(x) <- letters[1:3]
expect_true(is.named(x))
# Malformed or weird names
names(x)[2] <- ""
expect_false(is.named(x))
names(x)[2] <- NA
expect_false(is.named(x))
names(x) <- NULL
expect_false(is.named(x))
expect_false(is.named(NA))
expect_false(is.named(NULL))
})
test_that("has_attr works correctly", {
x <- 1:3
expect_false(has_attr(x, "names"))
names(x) <- letters[1:3]
expect_true(has_attr(x, "names"))
expect_false(has_attr(x, "something else"))
# not sure what else to test here
})
test_that("has_name works correctly", {
x <- 1:3
expect_false(has_name(x, "a"))
names(x) <- letters[1:3]
expect_true(has_name(x, letters[2]))
expect_false(has_name(x, "something else"))
expect_false(has_name(x, NA))
expect_true(has_name(x, c("a", "b")))
expect_true(has_name(x, c("a", "b", "c")))
expect_false(has_name(x, c("a", "d")))
})
test_that("noNA works correctly", {
expect_true(noNA("a"))
expect_false(noNA(c(TRUE, NA)))
x <- sample(c(1:10, NA), 100, rep = TRUE)
expect_false(noNA(x))
expect_true(noNA(1:1000))
})
test_that("are_equal works correctly", {
x <- 2
expect_false(are_equal(x, 1.9))
expect_true(are_equal(x, 1.999, tol = 0.01))
expect_true(are_equal(x, 2))
expect_true(are_equal('a', 'a'))
expect_false(are_equal('a', 'b'))
expect_true(are_equal(NA, NA))
expect_true(are_equal(NULL, NULL))
})
test_that("is.error works correctly", {
x <- try(stop("!!"), silent=TRUE)
expect_true(is.error(x))
expect_false(is.error(1))
expect_false(is.error(NA))
expect_false(is.error(NULL))
})
test_that("is.time works correctly", {
expect_true(is.time(Sys.time()))
expect_false(is.time(Sys.Date()))
expect_false(is.time(1))
expect_false(is.time(NA))
expect_false(is.time(NULL))
})
test_that("is.date works correctly", {
expect_false(is.date(Sys.time()))
expect_true(is.date(Sys.Date()))
expect_false(is.date(1))
expect_false(is.date(NA))
expect_false(is.date(NULL))
})
test_that("has_args works correctly", {
expect_error(1 %has_args% "x")
expect_true(mean %has_args% "x")
expect_false(mean %has_args% "y")
expect_error(NA %has_args% "x")
expect_error(NULL %has_args% "x")
# should pass with exact = FALSE even if you don't list all the args or the order is different
expect_true(has_args(rnorm, "n"))
expect_true(has_args(rnorm, c("n", "mean")))
expect_true(has_args(rnorm, c("mean", "sd", "n")))
# should fail with exact = TRUE if you don't list all the args or the order is different
expect_false(has_args(rnorm, "n", exact = TRUE))
expect_false(has_args(rnorm, c("n", "mean"), exact = TRUE))
expect_false(has_args(rnorm, c("mean", "sd", "n"), exact = TRUE))
})
test_that("not_empty works correctly", {
expect_true(not_empty(1))
expect_false(not_empty(numeric()))
expect_false(not_empty(mtcars[0, ]))
expect_false(not_empty(mtcars[, 0]))
expect_true(not_empty(NA))
expect_false(not_empty(NULL))
})
| 3,951 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-base-comparison.R | context("base-comparison")
test_that("assert_that respects custom error messages for base operators", {
expect_error(assert_that(5 == 'i'), "not equal to")
expect_error(assert_that(5 < 4), "not less than")
expect_error(assert_that(4 > 5), "not greater than")
expect_error(assert_that(4 >= 5), "not greater than or equal to")
expect_error(assert_that(5 <= 4), "not less than or equal to")
expect_error(assert_that(5 != 5), "not equal to")
})
| 454 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-base.R | context("Base assertions")
test_that("any message is useful", {
expect_equal(validate_that(any(TRUE, FALSE)), TRUE)
x <- c(FALSE, FALSE)
expect_equal(validate_that(any(x)), "No elements of x are true")
})
test_that("all message is useful", {
expect_equal(validate_that(all(TRUE, TRUE)), TRUE)
x <- c(FALSE, TRUE)
expect_match(validate_that(all(x)), "Elements .* of x are not true")
})
test_that("custom message is printed", {
expect_equal(validate_that(FALSE, msg = "Custom message"), "Custom message")
})
| 529 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-file.R | context("File assertions")
test_that("is.dir identifies dirs correctly", {
expect_true(is.dir(tempdir()))
expect_error(is.dir(tempfile()))
})
test_that("is.writeable works correctly", {
expect_true(is.writeable(tempdir()))
tf <- tempfile()
expect_error(is.writeable(tf)) # file doesn't exist yet
cat("foo", file=tf)
expect_true(is.writeable(tf)) # ...but now it does
})
test_that("is.readable works correctly", {
expect_true(is.readable(tempdir()))
tf <- tempfile()
expect_error(is.readable(tf)) # file doesn't exist yet
cat("foo", file=tf)
expect_true(is.readable(tf)) # ...but now it does
})
test_that("has_extension works correctly", {
# no extension
tf <- tempfile()
expect_true(has_extension(tf, ""))
expect_false(has_extension(tf, "x"))
# normal extension
ext <- "test"
tf <- tempfile(fileext=paste0(".", ext))
expect_true(has_extension(tf, ext))
expect_false(has_extension(tf, paste0(ext, "x")))
# empty extension
ext <- ""
tf <- tempfile(fileext=paste0(".", ext))
expect_true(has_extension(tf, ext))
expect_false(has_extension(tf, paste0(ext, "x")))
})
| 1,130 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-on-failure.R | context("on-failure")
test_that("on_failure should work", {
is_red <- function(x) {x == "red"}
on_failure(is_red) <- function(call, env) {
paste0(deparse(call$x), " is not red")
}
res <- see_if(is_red("blue"))
expect_false(res[[1]])
expect_identical(attr(res, "msg"), '"blue" is not red')
})
| 314 | gpl-3.0 |
assertthat | cran-assertthat-b28a7b8/tests/testthat/test-scalar.R | context("Scalar assertions")
test_that("is.scalar works correctly", {
expect_true(is.scalar(1))
expect_true(is.scalar(-1))
expect_true(is.scalar(1.5))
expect_false(is.scalar(1:5))
expect_true(is.scalar('a'))
expect_false(is.scalar(c('a', 'b')))
expect_true(is.scalar(TRUE))
expect_false(is.scalar(c(TRUE, FALSE)))
expect_false(is.scalar(NULL))
expect_true(is.scalar(NA))
expect_true(is.scalar(Inf))
expect_true(is.scalar(-Inf))
})
test_that("is.string works correctly", {
expect_false(is.string(1))
expect_true(is.string('a'))
expect_false(is.string(c('a', 'b')))
expect_false(is.string(TRUE))
expect_false(is.string(NULL))
expect_false(is.string(NA))
expect_false(is.string(Inf))
expect_false(is.string(-Inf))
})
test_that("is.number works correctly", {
expect_true(is.number(1))
expect_true(is.number(-1))
expect_true(is.number(1.5))
expect_false(is.number(1:5))
expect_false(is.number('a'))
expect_false(is.number(TRUE))
expect_false(is.number(NULL))
expect_false(is.number(NA))
expect_true(is.number(Inf))
expect_true(is.number(-Inf))
})
test_that("is.flag works correctly", {
expect_false(is.flag(1))
expect_false(is.flag('a'))
expect_true(is.flag(TRUE))
expect_true(is.flag(FALSE))
expect_false(is.flag(c(TRUE, FALSE)))
expect_false(is.flag(NULL))
expect_equal(is.flag(NA), is.logical(NA)) # not obvious
expect_false(is.flag(Inf))
expect_false(is.flag(-Inf))
})
test_that("is.count works correctly", {
expect_true(is.count(1))
expect_false(is.count(-1))
expect_false(is.count(1.5))
expect_false(is.count(1:5))
expect_false(is.count('a'))
expect_false(is.count(TRUE))
expect_false(is.count(NULL))
expect_false(is.count(NA))
expect_false(is.count(NA_real_))
expect_false(is.count(NA_integer_))
expect_false(is.count(NaN))
expect_true(is.count(Inf))
expect_false(is.count(-Inf))
expect_false(is.count(1e10 + 0.0001))
expect_false(is.count(1e10 - 0.1))
})
| 1,971 | gpl-3.0 |
devtools | cran-devtools-945c660/R/R.R | #' Environment variables to set when calling R
#'
#' Devtools sets a number of environment variables to ensure consistency
#' between the current R session and the new session, and to ensure that
#' everything behaves the same across systems. It also suppresses a common
#' warning on Windows, and sets `NOT_CRAN` so you can tell that your
#' code is not running on CRAN. If `NOT_CRAN` has been set externally, it
#' is not overwritten.
#'
#' @keywords internal
#' @return a named character vector
#' @export
r_env_vars <- function() {
vars <- c(
"R_LIBS" = paste(.libPaths(), collapse = .Platform$path.sep),
"CYGWIN" = "nodosfilewarning",
# When R CMD check runs tests, it sets R_TESTS. When the tests
# themselves run R CMD xxxx, as is the case with the tests in
# devtools, having R_TESTS set causes errors because it confuses
# the R subprocesses. Un-setting it here avoids those problems.
"R_TESTS" = "",
"R_BROWSER" = "false",
"R_PDFVIEWER" = "false"
)
if (is.na(Sys.getenv("NOT_CRAN", unset = NA))) {
vars[["NOT_CRAN"]] <- "true"
}
vars
}
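A sketch of typical use (assuming the callr package, which devtools itself uses to spawn child processes): pass the vector as the child's environment:

```r
# Run a function in a fresh R session with devtools' environment
# variables applied; the child can then see e.g. NOT_CRAN.
not_cran <- callr::r(
  function() Sys.getenv("NOT_CRAN"),
  env = c(callr::rcmd_safe_env(), devtools::r_env_vars())
)
```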
| 1,099 | mit |
devtools | cran-devtools-945c660/R/active.R | find_active_file <- function(arg = "file", call = parent.frame()) {
if (!is_rstudio_running()) {
cli::cli_abort("Argument {.arg {arg}} is missing, with no default", call = call)
}
normalizePath(rstudioapi::getSourceEditorContext()$path)
}
find_test_file <- function(path, call = parent.frame()) {
type <- test_file_type(path)
if (any(is.na(type))) {
file <- path_file(path[is.na(type)])
cli::cli_abort(
"Don't know how to find tests associated with the active file {.file {file}}",
call = call
)
}
is_test <- type == "test"
path[!is_test] <- paste0("tests/testthat/test-", name_source(path[!is_test]), ".R")
path <- unique(path[file_exists(path)])
if (length(path) == 0) {
cli::cli_abort("No test files found", call = call)
}
path
}
test_file_type <- function(path) {
dir <- path_file(path_dir(path))
name <- path_file(path)
ext <- tolower(path_ext(path))
src_ext <- c("c", "cc", "cpp", "cxx", "h", "hpp", "hxx")
type <- rep(NA_character_, length(path))
type[dir == "R" & ext == "r"] <- "R"
type[dir == "testthat" & ext == "r" & grepl("^test", name)] <- "test"
type[dir == "src" & ext %in% src_ext] <- "src"
type
}
# Figure out "name" of a test or source file
name_test <- function(path) {
gsub("^test[-_]", "", name_source(path))
}
name_source <- function(path) {
path_ext_remove(path_file(path))
}
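These helpers are internal; a sketch using local copies (`path_file()` and `path_ext_remove()` come from the fs package, which devtools imports) shows the naming convention they implement:

```r
library(fs)

# Local copies of the internal helpers above.
name_source <- function(path) path_ext_remove(path_file(path))
name_test <- function(path) gsub("^test[-_]", "", name_source(path))

name_source("R/active.R")                  # "active"
name_test("tests/testthat/test-active.R")  # "active"
name_test("tests/testthat/test_utils.R")   # "utils"
```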
| 1,385 | mit |
devtools | cran-devtools-945c660/R/bash.R | #' Open bash shell in package directory.
#'
#' @template devtools
#' @export
bash <- function(pkg = ".") {
pkg <- as.package(pkg)
withr::with_dir(pkg$path, system("bash"))
}
| 179 | mit |
devtools | cran-devtools-945c660/R/build-manual.R | #' Create package pdf manual
#'
#' @template devtools
#' @param path path in which to produce package manual.
#' If `NULL`, defaults to the parent directory of the package.
#'
#' @seealso [Rd2pdf()]
#' @export
build_manual <- function(pkg = ".", path = NULL) {
pkg <- as.package(pkg)
path <- path %||% path_dir(pkg$path)
name <- paste0(pkg$package, "_", pkg$version, ".pdf", collapse = " ")
tryCatch(msg <- callr::rcmd("Rd2pdf", cmdargs = c(
"--force",
paste0("--output=", path, "/", name),
pkg$path
), fail_on_status = TRUE, stderr = "2>&1", spinner = FALSE),
error = function(e) {
cat(e$stdout)
cli::cli_abort("Failed to build manual")
})
cat(msg$stdout)
invisible(msg)
}
| 715 | mit |
devtools | cran-devtools-945c660/R/build-readme.R | #' Build a Rmarkdown files package
#'
#' `build_rmd()` is a wrapper around [rmarkdown::render()] that first installs
#' a temporary copy of the package, and then renders each `.Rmd` in a clean R
#' session. `build_readme()` locates your `README.Rmd` and builds it into a
#' `README.md`
#'
#' @param files The Rmarkdown files to be rendered.
#' @param path path to the package to build the readme.
#' @param ... additional arguments passed to [rmarkdown::render()]
#' @inheritParams install
#' @inheritParams rmarkdown::render
#' @export
build_rmd <- function(files, path = ".", output_options = list(), ..., quiet = TRUE) {
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
pkg <- as.package(path)
rlang::check_installed("rmarkdown")
save_all()
paths <- files
abs_files <- is_absolute_path(files)
paths[!abs_files] <- path(pkg$path, files[!abs_files])
ok <- file_exists(paths)
if (any(!ok)) {
cli::cli_abort("Can't find file{?s}: {.path {files[!ok]}}.")
}
local_install(pkg, quiet = TRUE)
# Ensure rendering github_document() doesn't generate HTML file
output_options$html_preview <- FALSE
for (path in paths) {
cli::cli_inform(c(i = "Building {.path {path}}"))
callr::r_safe(
function(...) rmarkdown::render(...),
args = list(input = path, ..., output_options = output_options, quiet = quiet),
show = TRUE,
spinner = FALSE,
stderr = "2>&1"
)
}
invisible(TRUE)
}
#' @rdname build_rmd
#' @export
build_readme <- function(path = ".", quiet = TRUE, ...) {
pkg <- as.package(path)
readme_path <- path_abs(dir_ls(pkg$path, ignore.case = TRUE, regexp = "(inst/)?readme[.]rmd", recurse = 1, type = "file"))
if (length(readme_path) == 0) {
cli::cli_abort("Can't find {.file README.Rmd} or {.file inst/README.Rmd}.")
}
if (length(readme_path) > 1) {
cli::cli_abort("Can't have both {.file README.Rmd} and {.file inst/README.Rmd}.")
}
build_rmd(readme_path, path = path, quiet = quiet, ...)
}
# (2,024 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/build-site.R ----
#' Execute \pkg{pkgdown} build_site in a package
#'
#' `build_site()` is a shortcut for [pkgdown::build_site()]; it generates the
#' static HTML documentation.
#'
#' @param path path to the package to build the static HTML.
#' @param ... additional arguments passed to [pkgdown::build_site()]
#' @inheritParams install
#'
#' @return NULL
#' @export
build_site <- function(path = ".", quiet = TRUE, ...) {
rlang::check_installed("pkgdown")
save_all()
pkg <- as.package(path)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
withr::with_temp_libpaths(action = "prefix", code = {
install(pkg = pkg$path, upgrade = "never", reload = FALSE, quiet = quiet)
if (isTRUE(quiet)) {
withr::with_output_sink(
file_temp(),
pkgdown::build_site(pkg = pkg$path, ...)
)
} else {
pkgdown::build_site(pkg = pkg$path, ...)
}
})
}
# (907 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check-devtools.R ----
#' Custom devtools release checks.
#'
#' This function performs additional checks prior to release. It is called
#' automatically by [release()].
#'
#' @template devtools
#' @keywords internal
#' @export
release_checks <- function(pkg = ".", built_path = NULL) {
pkg <- as.package(pkg)
cat_rule(paste0("Running additional devtools checks for ", pkg$package))
check_version(pkg)
check_dev_versions(pkg)
check_vignette_titles(pkg)
check_news_md(pkg)
check_remotes(pkg)
cat_rule()
}
check_dev_versions <- function(pkg = ".") {
pkg <- as.package(pkg)
dep_list <- pkg[tolower(remotes::standardise_dep(TRUE))]
deps <- do.call("rbind", unname(compact(lapply(dep_list, parse_deps))))
deps <- deps[!is.na(deps$version), , drop = FALSE]
parsed <- lapply(deps$version, function(x) unlist(numeric_version(x)))
lengths <- vapply(parsed, length, integer(1))
last_ver <- vapply(parsed, function(x) x[[length(x)]], integer(1))
is_dev <- lengths == 4 & last_ver >= 9000
check_status(
!any(is_dev),
"dependencies don't rely on dev versions",
paste(
"depends on devel versions of: ",
paste0(deps$name[is_dev], collapse = ", ")
)
)
return(invisible(FALSE))
}
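The dev-version test in `check_dev_versions()` can be reproduced on its own. A hedged sketch (the helper name `is_dev_version` is made up) of the same `numeric_version()` heuristic:

```r
# Sketch of the heuristic used by check_dev_versions(): a version is treated
# as a development version when it has four components and the last one is
# 9000 or higher (e.g. "1.2.3.9000").
is_dev_version <- function(v) {
  parts <- unlist(numeric_version(v))
  length(parts) == 4 && parts[[length(parts)]] >= 9000
}

is_dev_version("1.2.3.9000")  # TRUE
is_dev_version("1.2.3")       # FALSE
```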
check_version <- function(pkg = ".") {
pkg <- as.package(pkg)
ver <- unlist(numeric_version(pkg$version))
check_status(
length(ver) == 3,
"version number has three components",
paste0("version (", pkg$version, ") should have exactly three components")
)
}
check_vignette_titles <- function(pkg = ".") {
pkg <- as.package(pkg)
vigns <- tools::pkgVignettes(dir = pkg$path)
if (length(vigns$docs) == 0) return()
has_vignette_title <- function(v, n) {
h <- readLines(v, n = n)
any(grepl("Vignette Title", h))
}
v <- stats::setNames(vigns$docs, path_file(vigns$docs))
has_vt <- vapply(v, has_vignette_title, logical(1), n = 30)
check_status(
!any(has_vt),
"vignette titles are not placeholders",
paste0(
"placeholder 'Vignette Title' detected in 'title' field and/or ",
"'VignetteIndexEntry' for: ",
paste(names(has_vt)[has_vt], collapse = ",")
)
)
}
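The placeholder scan above reduces to a single `grepl()` over the first lines of each vignette. A standalone sketch (the helper name `has_placeholder_title` is an invention for illustration):

```r
# Standalone sketch of the scan in check_vignette_titles(): read the leading
# lines of a vignette and flag the literal placeholder string
# "Vignette Title" left over from a template.
has_placeholder_title <- function(lines) {
  any(grepl("Vignette Title", lines))
}

has_placeholder_title(c("---", 'title: "Vignette Title"', "---"))  # TRUE
has_placeholder_title(c("---", 'title: "Using mypkg"', "---"))     # FALSE
```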
check_news_md <- function(pkg) {
pkg <- as.package(pkg)
news_path <- path(pkg$path, "NEWS.md")
if (!file_exists(news_path)) {
return()
}
ignore_path <- path(pkg$path, ".Rbuildignore")
if (!file_exists(ignore_path)) {
ignore_lines <- character()
} else {
ignore_lines <- readLines(ignore_path)
}
has_news <- grepl("NEWS\\.md", ignore_lines, fixed = TRUE) |
grepl("NEWS.md", ignore_lines, fixed = TRUE)
check_status(
!any(has_news),
"NEWS.md is not ignored",
"NEWS.md now supported by CRAN and doesn't need to be ignored."
)
news_rd_path <- path(pkg$path, "inst/NEWS.Rd")
check_status(
!file_exists(news_rd_path),
"NEWS.Rd does not exist",
"NEWS.md now supported by CRAN, NEWS.Rd can be removed."
)
}
check_remotes <- function(pkg) {
check_status(
!has_dev_remotes(pkg),
"DESCRIPTION doesn't have Remotes field",
"Remotes field should be removed before CRAN submission."
)
}
has_dev_remotes <- function(pkg) {
!is.null(pkg[["remotes"]])
}
check_status <- function(status, name, warning) {
cat("Checking ", name, "...", sep = "")
status <- tryCatch(
if (status) {
cat(" OK\n")
} else {
cat("\n")
cli::cli_inform(c(x = "WARNING: {warning}"))
},
error = function(e) {
cat("\n")
cli::cli_inform(c(x = "ERROR: {conditionMessage(e)}"))
FALSE
}
)
invisible(status)
}
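The reporting pattern used by `check_status()` can be shown self-contained. This simplified stand-in replaces the cli-based messaging with plain `cat()`; the name `check_status_demo` is made up:

```r
# Self-contained illustration of the check_status() reporting pattern: print
# the check name, then either " OK" or the warning on the next line, and
# return the status invisibly.
check_status_demo <- function(status, name, warning) {
  cat("Checking ", name, "...", sep = "")
  if (status) {
    cat(" OK\n")
  } else {
    cat("\nWARNING: ", warning, "\n", sep = "")
  }
  invisible(status)
}

check_status_demo(TRUE, "version number has three components",
                  "version should have three components")
# prints: Checking version number has three components... OK
```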
# (3,564 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check-doc.R ----
#' Check documentation, as `R CMD check` does.
#'
#' This function attempts to run the documentation related checks in the
#' same way that `R CMD check` does. Unfortunately it can't run them
#' all because some tests require the package to be loaded, and the way
#' they attempt to load the code conflicts with how devtools does it.
#'
#' @template devtools
#' @return Nothing. This function is called purely for its side effects: if
#'   there are no errors, there will be no output.
#' @export
#' @examples
#' \dontrun{
#' check_man("mypkg")
#' }
check_man <- function(pkg = ".") {
pkg <- as.package(pkg)
document(pkg)
old <- options(warn = -1)
on.exit(options(old))
cli::cli_inform(c(i = "Checking documentation..."))
check_Rd_contents <- if (getRversion() < "4.1") {
asNamespace("tools")$.check_Rd_contents
} else {
asNamespace("tools")$checkRdContents
}
ok <-
all(
man_message(("tools" %:::% ".check_package_parseRd")(dir = pkg$path)),
man_message(("tools" %:::% ".check_Rd_metadata")(dir = pkg$path)),
man_message(("tools" %:::% ".check_Rd_xrefs")(dir = pkg$path)),
man_message(check_Rd_contents(dir = pkg$path)),
man_message(tools::checkDocFiles(dir = pkg$path)),
man_message(tools::checkDocStyle(dir = pkg$path)),
man_message(tools::checkReplaceFuns(dir = pkg$path)),
man_message(tools::checkS3methods(dir = pkg$path)),
man_message(tools::undoc(dir = pkg$path))
)
if (ok) {
cli::cli_inform(c(v = "No issues detected"))
}
invisible()
}
man_message <- function(x) {
if (inherits(x, "undoc") && length(x$code) == 0) {
# Returned by tools::undoc()
TRUE
} else if ("bad" %in% names(x) && length(x$bad) == 0) {
# Returned by check_Rd_xrefs()
TRUE
} else if (length(x) == 0) {
TRUE
} else {
print(x)
FALSE
}
}
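The convention `man_message()` encodes, an empty check result means "pass", can be distilled into a few lines. A hypothetical simplification (the name `man_ok` is made up):

```r
# Distillation of the man_message() convention: an empty check result means
# "no problems found"; anything non-empty is printed and counts as a failure.
man_ok <- function(x) {
  if (length(x) == 0) {
    TRUE
  } else {
    print(x)
    FALSE
  }
}

man_ok(character())        # TRUE  -> check passed
man_ok("undocumented fn")  # prints the problem, returns FALSE
```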
# (1,843 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check-git.R ----
#' Git checks.
#'
#' This function performs Git checks prior to release. It is called
#' automatically by [release()].
#'
#' @template devtools
#' @keywords internal
git_checks <- function(pkg = ".") {
pkg <- as.package(pkg)
cat_rule(paste0("Running Git checks for ", pkg$package))
git_report_branch(pkg)
git_check_uncommitted(pkg)
cat_rule()
}
git_report_branch <- function(pkg) {
cat("Current branch:", git_branch(pkg$path), "\n")
}
git_check_uncommitted <- function(pkg) {
check_status(
!git_uncommitted(pkg$path),
"uncommitted files",
"All files should be tracked and committed before release. Please add and commit."
)
}
# (664 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check-mac.R ----
#' Check macOS package
#'
#' This function works by bundling the source package and then uploading it to
#' <https://mac.r-project.org/macbuilder/submit.html>. It returns a
#' link to the page with the check results.
#'
#' @template devtools
#' @inheritParams check_win
#' @param dep_pkgs Additional custom dependencies to install prior to checking the package.
#' @family build functions
#' @return The url with the check results (invisibly)
#' @export
check_mac_release <- function(pkg = ".", dep_pkgs = character(), args = NULL, manual = TRUE, quiet = FALSE, ...) {
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
pkg <- as.package(pkg)
if (!quiet) {
cli::cli_inform(c(
"Building macOS version of {.pkg {pkg$package}} ({pkg$version})",
i = "Using https://mac.r-project.org/macbuilder/submit.html."
))
}
built_path <- pkgbuild::build(pkg$path, tempdir(),
args = args,
manual = manual, quiet = quiet, ...
)
dep_built_paths <- character()
for (i in seq_along(dep_pkgs)) {
dep_pkg <- as.package(dep_pkgs[[i]])$path
dep_built_paths[[i]] <- pkgbuild::build(dep_pkg, tempdir(),
args = args,
manual = manual, quiet = quiet, ...
)
}
on.exit(file_delete(c(built_path, dep_built_paths)), add = TRUE)
url <- "https://mac.r-project.org/macbuilder/v1/submit"
body <- list(pkgfile = httr::upload_file(built_path))
if (length(dep_built_paths) > 0) {
uploads <- lapply(dep_built_paths, httr::upload_file)
names(uploads) <- rep("depfiles", length(uploads))
body <- append(body, uploads)
}
res <- httr::POST(url,
body = body,
headers = list(
"Content-Type" = "multipart/form-data"
),
encode = "multipart"
)
httr::stop_for_status(res, task = "Uploading package")
response_url <- httr::content(res)$url
if (!quiet) {
time <- strftime(Sys.time() + 10 * 60, "%I:%M %p")
cli::cat_rule(col = "cyan")
cli::cli_inform(c(
      i = "Check {.url {response_url}} for the results in 5-10 mins (~{time})."
))
}
invisible(response_url)
}
# (2,096 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check-win.R ----
#' Build a Windows binary package.
#'
#' This function works by bundling the source package and then uploading it to
#' <https://win-builder.r-project.org/>. Once building is complete you'll
#' receive a link to the built package in the email address listed in the
#' maintainer field. It usually takes around 30 minutes. As a side effect,
#' win-build also runs `R CMD check` on the package, so `check_win`
#' is also useful to check that your package is ok on windows.
#'
#' @template devtools
#' @inheritParams pkgbuild::build
#' @param manual Should the manual be built?
#' @param email An alternative email to use, default `NULL` uses the package
#' Maintainer's email.
#' @param quiet If `TRUE`, suppresses output.
#' @param ... Additional arguments passed to [pkgbuild::build()].
#' @family build functions
#' @name check_win
NULL
#' @describeIn check_win Check package on the development version of R.
#' @export
check_win_devel <- function(pkg = ".", args = NULL, manual = TRUE, email = NULL, quiet = FALSE, ...) {
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
check_win(
pkg = pkg, version = "R-devel", args = args, manual = manual,
email = email, quiet = quiet, ...
)
}
#' @describeIn check_win Check package on the release version of R.
#' @export
check_win_release <- function(pkg = ".", args = NULL, manual = TRUE, email = NULL, quiet = FALSE, ...) {
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
check_win(
pkg = pkg, version = "R-release", args = args, manual = manual,
email = email, quiet = quiet, ...
)
}
#' @describeIn check_win Check package on the previous major release version of R.
#' @export
check_win_oldrelease <- function(pkg = ".", args = NULL, manual = TRUE, email = NULL, quiet = FALSE, ...) {
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
check_win(
pkg = pkg, version = "R-oldrelease", args = args, manual = manual,
email = email, quiet = quiet, ...
)
}
check_win <- function(pkg = ".", version = c("R-devel", "R-release", "R-oldrelease"),
args = NULL, manual = TRUE, email = NULL, quiet = FALSE, ...) {
pkg <- as.package(pkg)
if (!is.null(email)) {
desc_file <- path(pkg$path, "DESCRIPTION")
backup <- file_temp()
file_copy(desc_file, backup)
on.exit(file_move(backup, desc_file), add = TRUE)
change_maintainer_email(desc_file, email, call = parent.frame())
pkg <- as.package(pkg$path)
}
version <- match.arg(version, several.ok = TRUE)
if (!quiet) {
cli::cli_inform(c(
"Building windows version of {.pkg {pkg$package}} ({pkg$version})",
i = "Using {paste(version, collapse = ', ')} with win-builder.r-project.org."
))
email <- maintainer(pkg)$email
if (interactive() && yesno("Email results to {.strong {email}}?")) {
return(invisible())
}
}
built_path <- pkgbuild::build(pkg$path, tempdir(),
args = args,
manual = manual, quiet = quiet, ...
)
on.exit(file_delete(built_path), add = TRUE)
url <- paste0(
"ftp://win-builder.r-project.org/", version, "/",
path_file(built_path)
)
lapply(url, upload_ftp, file = built_path)
if (!quiet) {
time <- strftime(Sys.time() + 30 * 60, "%I:%M %p")
email <- maintainer(pkg)$email
cli::cat_rule(col = "cyan")
cli::cli_inform(c(
i = "Check <{.email {email}}> for the results in 15-30 mins (~{time})."
))
}
invisible()
}
change_maintainer_email <- function(path, email, call = parent.frame()) {
desc <- desc::desc(file = path)
if (!desc$has_fields("Authors@R")) {
cli::cli_abort(
"DESCRIPTION must use {.field Authors@R} field when changing {.arg email}",
call = call
)
}
if (desc$has_fields("Maintainer")) {
cli::cli_abort(
"DESCRIPTION can't use {.field Maintainer} field when changing {.arg email}",
call = call
)
}
aut <- desc$get_authors()
roles <- aut$role
## Broken person() API, vector for 1 author, list otherwise...
if (!is.list(roles)) {
roles <- list(roles)
}
is_maintainer <- vapply(roles, function(r) all("cre" %in% r), logical(1))
aut[is_maintainer]$email <- email
desc$set_authors(aut)
desc$write()
}
upload_ftp <- function(file, url, verbose = FALSE) {
rlang::check_installed("curl")
stopifnot(file_exists(file))
stopifnot(is.character(url))
con <- file(file, open = "rb")
on.exit(close(con), add = TRUE)
h <- curl::new_handle(upload = TRUE, filetime = FALSE)
curl::handle_setopt(h, readfunction = function(n) {
readBin(con, raw(), n = n)
}, verbose = verbose)
curl::curl_fetch_memory(url, handle = h)
}
# (4,696 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/check.R ----
#' Build and check a package
#'
#' @description
#' `check()` automatically builds and checks a source package, using all known
#' best practices. `check_built()` checks an already-built package.
#'
#' Passing `R CMD check` is essential if you want to submit your package to
#' CRAN: you must not have any ERRORs or WARNINGs, and you want to ensure that
#' there are as few NOTEs as possible. If you are not submitting to CRAN, at
#' least ensure that there are no ERRORs or WARNINGs: these typically represent
#' serious problems.
#'
#' `check()` automatically builds a package before calling `check_built()`, as
#' this is the recommended way to check packages. Note that this process runs
#' in an independent R session, so nothing in your current workspace will affect
#' the process. Under-the-hood, `check()` and `check_built()` rely on
#' [pkgbuild::build()] and [rcmdcheck::rcmdcheck()].
#'
#' @section Environment variables:
#'
#' Devtools does its best to set up an environment that combines best practices
#' with how check works on CRAN. This includes:
#'
#' \itemize{
#'
#' \item The standard environment variables set by devtools:
#' [r_env_vars()]. Of particular note for package tests is the
#' `NOT_CRAN` env var which lets you know that your tests are not
#' running on CRAN, and hence can take a reasonable amount of time.
#'
#' \item Debugging flags for the compiler, set by
#' \code{\link{compiler_flags}(FALSE)}.
#'
#' \item If `aspell` is found, `_R_CHECK_CRAN_INCOMING_USE_ASPELL_`
#'   is set to `TRUE`. If no spell checker is installed, a warning is
#'   issued.
#'
#' \item env vars set by arguments `incoming`, `remote` and
#' `force_suggests`
#' }
#'
#' @return An object containing errors, warnings, notes, and more.
#' @template devtools
#' @inheritParams rcmdcheck::rcmdcheck
#' @param document By default (`NULL`) will document if your installed
#' roxygen2 version matches the version declared in the `DESCRIPTION`
#' file. Use `TRUE` or `FALSE` to override the default.
#' @param build_args Additional arguments passed to `R CMD build`
#' @param ... Additional arguments passed on to [pkgbuild::build()].
#' @param vignettes If `FALSE`, do not build or check vignettes, equivalent to
#'   using `args = '--ignore-vignettes'` and `build_args = '--no-build-vignettes'`.
#' @param cleanup `r lifecycle::badge("deprecated")` See `check_dir` for details.
#' @seealso [release()] if you want to send the checked package to
#' CRAN.
#' @export
check <- function(pkg = ".",
document = NULL,
build_args = NULL,
...,
manual = FALSE,
cran = TRUE,
remote = FALSE,
incoming = remote,
force_suggests = FALSE,
run_dont_test = FALSE,
args = "--timings",
env_vars = c(NOT_CRAN = "true"),
quiet = FALSE,
check_dir = NULL,
cleanup = deprecated(),
vignettes = TRUE,
error_on = c("never", "error", "warning", "note")) {
pkg <- as.package(pkg)
withr::local_options(list(warn = 1))
save_all()
if (lifecycle::is_present(cleanup)) {
lifecycle::deprecate_stop("1.11.0", "check(cleanup = )")
}
if (missing(error_on) && !interactive()) {
error_on <- "warning"
}
error_on <- match.arg(error_on)
document <- document %||% can_document(pkg)
if (document) {
if (!quiet) {
cat_rule("Documenting", col = "cyan", line = 2)
}
document(pkg, quiet = quiet)
if (!quiet) {
cli::cat_line()
}
}
if (!quiet) {
cat_rule("Building", col = "cyan", line = 2)
show_env_vars(pkgbuild::compiler_flags(FALSE))
}
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
if (identical(vignettes, FALSE)) {
args <- union(args, "--ignore-vignettes")
}
withr::with_envvar(pkgbuild::compiler_flags(FALSE), action = "prefix", {
built_path <- pkgbuild::build(
pkg$path,
tempdir(),
args = build_args,
quiet = quiet,
manual = manual,
vignettes = vignettes,
...
)
on.exit(file_delete(built_path), add = TRUE)
})
check_built(
built_path,
cran = cran,
remote = remote,
incoming = incoming,
force_suggests = force_suggests,
run_dont_test = run_dont_test,
manual = manual,
args = args,
env_vars = env_vars,
quiet = quiet,
check_dir = check_dir,
error_on = error_on
)
}
can_document <- function(pkg) {
required <- pkg$roxygennote
if (is.null(required)) {
# Doesn't use roxygen2 at all
return(FALSE)
}
installed <- packageVersion("roxygen2")
if (required != installed) {
cli::cat_rule("Documenting", col = "red", line = 2)
cli::cli_inform(c(
i = "Installed roxygen2 version ({installed}) doesn't match required ({required})",
x = "{.fun check} will not re-document this package"
))
FALSE
} else {
TRUE
}
}
#' @export
#' @rdname check
#' @param path Path to built package.
#' @param cran if `TRUE` (the default), check using the same settings as CRAN
#' uses. Because this is a moving target and is not uniform across all of
#'   CRAN's machines, this is on a "best effort" basis. It is more complicated
#' than simply setting `--as-cran`.
#' @param remote Sets `_R_CHECK_CRAN_INCOMING_REMOTE_` env var. If `TRUE`,
#' performs a number of CRAN incoming checks that require remote access.
#' @param incoming Sets `_R_CHECK_CRAN_INCOMING_` env var. If `TRUE`, performs a
#' number of CRAN incoming checks.
#' @param force_suggests Sets `_R_CHECK_FORCE_SUGGESTS_`. If `FALSE` (the
#' default), check will proceed even if all suggested packages aren't found.
#' @param run_dont_test Sets `--run-donttest` so that examples surrounded in
#' `\donttest{}` are also run. When `cran = TRUE`, this only affects R 3.6 and
#' earlier; in R 4.0, code in `\donttest{}` is always run as part of CRAN
#' submission.
#' @param manual If `FALSE`, don't build and check manual (`--no-manual`).
#' @param env_vars Environment variables set during `R CMD check`
#' @param quiet if `TRUE` suppresses output from this function.
#' @inheritParams rcmdcheck::rcmdcheck
check_built <- function(path = NULL, cran = TRUE,
remote = FALSE, incoming = remote, force_suggests = FALSE,
run_dont_test = FALSE, manual = FALSE, args = "--timings",
env_vars = NULL, check_dir = tempdir(), quiet = FALSE,
error_on = c("never", "error", "warning", "note")) {
if (missing(error_on) && !interactive()) {
error_on <- "warning"
}
error_on <- match.arg(error_on)
pkgname <- gsub("_.*?$", "", path_file(path))
if (cran) {
args <- c("--as-cran", args)
env_vars <- c(
"_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_" = as.character(FALSE),
env_vars
)
}
if (run_dont_test) {
args <- c("--run-donttest", args)
}
if (manual && !pkgbuild::has_latex()) {
cli::cli_inform(c(x = "pdflatex not found! Not building PDF manual"))
manual <- FALSE
}
if (!manual) {
args <- c(args, "--no-manual")
}
env_vars <- check_env_vars(cran, remote, incoming, force_suggests, env_vars)
if (!quiet) {
cat_rule("Checking", col = "cyan", line = 2)
show_env_vars(env_vars)
}
withr::with_envvar(env_vars, action = "replace", {
rcmdcheck::rcmdcheck(path,
quiet = quiet, args = args,
check_dir = check_dir, error_on = error_on
)
})
}
check_env_vars <- function(cran = FALSE, remote = FALSE, incoming = remote,
force_suggests = TRUE, env_vars = character()) {
c(
aspell_env_var(),
# Switch off expensive check for package version
# https://github.com/r-lib/devtools/issues/1271
if (getRversion() >= "3.4.0" && as.numeric(R.version[["svn rev"]]) >= 70944) {
c("_R_CHECK_CRAN_INCOMING_REMOTE_" = as.character(remote))
},
"_R_CHECK_CRAN_INCOMING_" = as.character(incoming),
"_R_CHECK_FORCE_SUGGESTS_" = as.character(force_suggests),
env_vars
)
}
aspell_env_var <- function() {
tryCatch({
utils::aspell(NULL, program = "aspell")
c("_R_CHECK_CRAN_INCOMING_USE_ASPELL_" = "TRUE")
}, error = function(e) character())
}
show_env_vars <- function(env_vars) {
cli::cat_line("Setting env vars:", col = "darkgrey")
cat_bullet(paste0(format(names(env_vars)), ": ", unname(env_vars)), col = "darkgrey")
}
# (8,581 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/create.R ----
#' Create a package
#'
#' @param path A path. If it exists, it is used. If it does not exist, it is
#' created, provided that the parent path exists.
#' @param ... Additional arguments passed to [usethis::create_package()]
#' @inheritParams usethis::create_package
#' @return The path to the created package, invisibly.
#' @export
create <- function(path, ..., open = FALSE) {
usethis::create_package(path, ..., open = open)
}
# (431 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/dev-mode.R ----
#' Activate and deactivate development mode.
#'
#' When activated, `dev_mode` creates a new library for storing installed
#' packages. This new library is automatically created when `dev_mode` is
#' activated if it does not already exist.
#' This allows you to test development packages in a sandbox, without
#' interfering with the other packages you have installed.
#'
#' @param on turn dev mode on (`TRUE`) or off (`FALSE`). If omitted
#' will guess based on whether or not `path` is in
#' [.libPaths()]
#' @param path directory to library.
#' @export
#' @examples
#' \dontrun{
#' dev_mode()
#' dev_mode()
#' }
dev_mode <- local({
.prompt <- NULL
function(on = NULL, path = getOption("devtools.path")) {
lib_paths <- .libPaths()
path <- path_real(path)
if (is.null(on)) {
on <- !(path %in% lib_paths)
}
if (on) {
if (!file_exists(path)) {
dir_create(path)
}
if (!file_exists(path)) {
cli::cli_abort("Failed to create {.path {path}}")
}
if (!is_library(path)) {
cli::cli_warn(c(
"{.path {path}} does not appear to be a library.",
        "Are you sure you specified the correct directory?"
))
}
cli::cli_inform(c(v = "Dev mode: ON"))
options(dev_path = path)
if (is.null(.prompt)) .prompt <<- getOption("prompt")
options(prompt = paste("d> "))
.libPaths(c(path, lib_paths))
} else {
cli::cli_inform(c(v = "Dev mode: OFF"))
options(dev_path = NULL)
if (!is.null(.prompt)) options(prompt = .prompt)
.prompt <<- NULL
.libPaths(setdiff(lib_paths, path))
}
}
})
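The sandboxing mechanism behind `dev_mode()` is, at its core, just prepending a private library to `.libPaths()`. A hedged sketch of that idea in isolation (the directory name `dev-lib` is an arbitrary choice; the real function also manages the prompt and `devtools.path` option):

```r
# Sketch of the sandbox mechanism behind dev_mode(): prepend a private
# library to .libPaths() so installs land there, then restore the original
# search path to deactivate.
sandbox <- file.path(tempdir(), "dev-lib")
dir.create(sandbox, showWarnings = FALSE)

old <- .libPaths()
.libPaths(c(sandbox, old))   # ON: new installs now go to the sandbox first
stopifnot(any(grepl("dev-lib", .libPaths(), fixed = TRUE)))

.libPaths(old)               # OFF: restore the original library search path
```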
is_library <- function(path) {
# empty directories can be libraries
if (length(dir_ls(path)) == 0) return(TRUE)
# otherwise check that the directories are compiled R directories -
# i.e. that they contain a Meta directory
dirs <- dir_ls(path, type = "directory")
has_pkg_dir <- function(path) length(dir_ls(path, regexp = "Meta")) > 0
help_dirs <- vapply(dirs, has_pkg_dir, logical(1))
all(help_dirs)
}
# (2,070 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/devtools-package.R ----
#' @section Package options:
#'
#' Devtools uses the following [options()] to configure behaviour:
#'
#' \itemize{
#' \item `devtools.path`: path to use for [dev_mode()]
#'
#' \item `devtools.name`: your name, used when signing draft
#' emails.
#'
#' \item `devtools.install.args`: a string giving extra arguments passed
#' to `R CMD install` by [install()].
#' }
#' @docType package
#' @keywords internal
"_PACKAGE"
## usethis namespace: start
#' @importFrom lifecycle deprecated
#' @importFrom miniUI miniPage
#' @importFrom profvis profvis
#' @importFrom urlchecker url_check
## usethis namespace: end
NULL
# (625 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/document.R ----
#' Use roxygen to document a package.
#'
#' This function is a wrapper for the [roxygen2::roxygenize()]
#' function from the roxygen2 package. See the documentation and vignettes of
#' that package to learn how to use roxygen.
#'
#' @template devtools
#' @inheritParams roxygen2::roxygenise
#' @param quiet if `TRUE` suppresses output from this function.
#' @seealso [roxygen2::roxygenize()],
#' `browseVignettes("roxygen2")`
#' @export
document <- function(pkg = ".", roclets = NULL, quiet = FALSE) {
pkg <- as.package(pkg)
if (!isTRUE(quiet)) {
cli::cli_inform(c(i = "Updating {.pkg {pkg$package}} documentation"))
}
save_all()
if (pkg$package == "roxygen2") {
# roxygen2 crashes if it reloads itself
load_all(pkg$path, quiet = quiet)
}
if (quiet) {
output <- file_temp()
withr::defer(file_delete(output))
withr::local_output_sink(output)
}
withr::local_envvar(r_env_vars())
roxygen2::roxygenise(pkg$path, roclets)
pkgload::dev_topic_index_reset(pkg$package)
invisible()
}
# (1,028 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/git.R ----
uses_git <- function(path = ".") {
dir_exists(path(path, ".git"))
}
git_branch <- function(path = ".") {
withr::local_dir(path)
system2("git", c("rev-parse", "--abbrev-ref", "HEAD"), stdout = TRUE)
}
git_uncommitted <- function(path = ".") {
withr::local_dir(path)
out <- system2("git", c("status", "--porcelain=v1"), stdout = TRUE)
length(out) > 0
}
# (367 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/has-tests.R ----
#' Was devtools installed with tests?
#'
#' @keywords internal
#' @export
has_tests <- function() {
test_path <- tryCatch(
path_package("devtools", "tests"),
error = function(e) NULL
)
!is.null(test_path)
}
# (222 bytes, MIT license)

# ---- devtools | cran-devtools-945c660/R/install.R ----
#' Install a local development package.
#'
#' Uses `R CMD INSTALL` to install the package. Will also try to install
#' dependencies of the package from CRAN, if they're not already installed.
#'
#' If `quick = TRUE`, installation takes place using the current package
#' directory. If you have compiled code, this means that artefacts of
#' compilation will be created in the `src/` directory. If you want to avoid
#' this, you can use `build = TRUE` to first build a package bundle and then
#' install it from a temporary directory. This is slower, but keeps the source
#' directory pristine.
#'
#' If the package is loaded, it will be reloaded after installation. This is
#' not always completely possible, see [reload()] for caveats.
#'
#' To install a package in a non-default library, use [withr::with_libpaths()].
#'
#' @template devtools
#' @inheritParams remotes::install_local
#' @param reload if `TRUE` (the default), will automatically reload the
#' package after installing.
#' @param quick if `TRUE` skips docs, multiple-architectures,
#' demos, and vignettes, to make installation as fast as possible.
#' @param build if `TRUE` [pkgbuild::build()]s the package first:
#' this ensures that the installation is completely clean, and prevents any
#' binary artefacts (like \file{.o}, `.so`) from appearing in your local
#' package directory, but is considerably slower, because every compile has
#' to start from scratch.
#' @param args An optional character vector of additional command line
#' arguments to be passed to `R CMD INSTALL`. This defaults to the
#' value of the option `"devtools.install.args"`.
#' @param build_vignettes if `TRUE`, will build vignettes. Normally it is
#' `build` that's responsible for creating vignettes; this argument makes
#' sure vignettes are built even if a build never happens (i.e. because
#' `build = FALSE`).
#' @param keep_source If `TRUE` will keep the srcrefs from an installed
#' package. This is useful for debugging (especially inside of RStudio).
#' It defaults to the option `"keep.source.pkgs"`.
#' @param ... additional arguments passed to [remotes::install_deps()]
#' when installing dependencies.
#' @family package installation
#' @seealso [update_packages()] to update installed packages from the
#' source location and [with_debug()] to install packages with
#' debugging flags set.
#' @export
install <-
function(pkg = ".", reload = TRUE, quick = FALSE, build = !quick,
args = getOption("devtools.install.args"), quiet = FALSE,
dependencies = NA, upgrade = "default",
build_vignettes = FALSE,
keep_source = getOption("keep.source.pkgs"),
force = FALSE,
...) {
pkg <- as.package(pkg)
# Forcing all of the promises for the current namespace now will avoid lazy-load
# errors when the new package is installed overtop the old one.
# https://stat.ethz.ch/pipermail/r-devel/2015-December/072150.html
if (reload && is_loaded(pkg)) {
eapply(pkgload::ns_env(pkg$package), force, all.names = TRUE)
}
if (isTRUE(build_vignettes)) {
# we likely need all Suggested dependencies if building vignettes
dependencies <- TRUE
build_opts <- c("--no-resave-data", "--no-manual")
} else {
build_opts <- c("--no-resave-data", "--no-manual", "--no-build-vignettes")
}
opts <- c(
if (keep_source) "--with-keep.source",
"--install-tests"
)
if (quick) {
opts <- c(opts, "--no-docs", "--no-multiarch", "--no-demo")
}
opts <- c(opts, args)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
remotes::install_deps(pkg$path,
build = build, build_opts = build_opts,
INSTALL_opts = opts, dependencies = dependencies, quiet = quiet,
force = force, upgrade = upgrade, ...
)
if (build) {
install_path <- pkgbuild::build(pkg$path, dest_path = tempdir(), args = build_opts, quiet = quiet)
on.exit(file_delete(install_path), add = TRUE)
} else {
install_path <- pkg$path
}
was_loaded <- is_loaded(pkg)
was_attached <- is_attached(pkg)
if (reload && was_loaded) {
pkgload::unregister(pkg$package)
}
pkgbuild::with_build_tools(required = FALSE,
callr::rcmd("INSTALL", c(install_path, opts), echo = !quiet, show = !quiet, spinner = FALSE, stderr = "2>&1", fail_on_status = TRUE)
)
if (reload && was_loaded) {
if (was_attached) {
require(pkg$package, quietly = TRUE, character.only = TRUE)
} else {
requireNamespace(pkg$package, quietly = TRUE)
}
}
invisible(TRUE)
}
#' Install package dependencies if needed.
#'
#' `install_deps()` will install the
#' user dependencies needed to run the package, `install_dev_deps()` will also
#' install the development dependencies needed to test and build the package.
#' @inheritParams install
#' @inherit remotes::install_deps
#' @export
install_deps <- function(pkg = ".",
dependencies = NA,
repos = getOption("repos"),
type = getOption("pkgType"),
upgrade = c("default", "ask", "always", "never"),
quiet = FALSE,
build = TRUE,
                         build_opts = c("--no-resave-data", "--no-manual", "--no-build-vignettes"),
...) {
pkg <- as.package(pkg)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
remotes::install_deps(
pkg$path,
dependencies = dependencies,
repos = repos,
type = type,
upgrade = upgrade,
quiet = quiet,
build = build,
build_opts = build_opts,
...
)
}
#' @rdname install_deps
#' @export
install_dev_deps <- function(pkg = ".",
dependencies = TRUE,
repos = getOption("repos"),
type = getOption("pkgType"),
upgrade = c("default", "ask", "always", "never"),
quiet = FALSE,
build = TRUE,
                             build_opts = c("--no-resave-data", "--no-manual", "--no-build-vignettes"),
...) {
remotes::update_packages("roxygen2")
pkg <- as.package(pkg)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
remotes::install_deps(
pkg$path,
dependencies = dependencies,
repos = repos,
type = type,
upgrade = upgrade,
quiet = quiet,
build = build,
build_opts = build_opts,
...
)
}
local_install <- function(pkg = ".", quiet = TRUE, env = parent.frame()) {
pkg <- as.package(pkg)
cli::cli_inform(c(i = "Installing {.pkg {pkg$package}} in temporary library"))
withr::local_temp_libpaths(.local_envir = env)
install(pkg, upgrade = "never", reload = FALSE, quick = TRUE, quiet = quiet)
}
# ---- cran-devtools-945c660/R/lint.R (license: MIT) ----
#' Lint all source files in a package
#'
#' The default linters correspond to the style guide at
#' <https://style.tidyverse.org/>, however it is possible to override any or all
#' of them using the `linters` parameter.
#'
#' @template devtools
#' @param cache Store the lint results so repeated lints of the same content use
#' the previous results. Consult the lintr package to learn more about its
#' caching behaviour.
#' @param ... Additional arguments passed to [lintr::lint_package()].
#' @seealso [lintr::lint_package()], [lintr::lint()]
#' @export
lint <- function(pkg = ".", cache = TRUE, ...) {
rlang::check_installed("lintr")
pkg <- as.package(pkg)
cli::cli_inform(c(i = "Linting {.pkg {pkg$package}}"))
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
lintr::lint_package(pkg$path, cache = cache, ...)
}
# ---- cran-devtools-945c660/R/missing-s3.R ----
#' Find missing S3 exports.
#'
#' The method is heuristic: it looks for objects with a period in their name.
#'
#' @template devtools
#' @export
missing_s3 <- function(pkg = ".") {
pkg <- as.package(pkg)
loaded <- load_all(pkg$path)
# Find all S3 methods in package
objs <- ls(envir = loaded$env)
is_s3 <- function(x) roxygen2::is_s3_method(x, env = loaded$env)
s3_objs <- Filter(is_s3, objs)
# Find all S3 methods in NAMESPACE
ns <- pkgload::parse_ns_file(pkg$path)
exports <- paste(ns$S3methods[, 1], ns$S3methods[, 2], sep = ".")
setdiff(s3_objs, exports)
}
# ---- cran-devtools-945c660/R/package-deps.R ----
#' @importFrom pkgload parse_deps
#' @export
pkgload::parse_deps
#' @importFrom pkgload check_dep_version
#' @export
pkgload::check_dep_version
# ---- cran-devtools-945c660/R/package.R ----
#' Coerce input to a package.
#'
#' Possible specifications of package:
#' \itemize{
#' \item path
#' \item package object
#' }
#' @param x object to coerce to a package
#' @param create `r lifecycle::badge("deprecated")` Hasn't worked for some time.
#' @export
#' @keywords internal
as.package <- function(x = NULL, create = deprecated()) {
if (is.package(x)) return(x)
if (lifecycle::is_present(create)) {
lifecycle::deprecate_warn("2.5.0", "as.package(create = )")
}
x <- package_file(path = x)
load_pkg_description(x)
}
#' Find file in a package.
#'
#' It always starts by walking up the path until it finds the root directory,
#' i.e. a directory containing `DESCRIPTION`. If it cannot find the root
#' directory, or it can't find the specified path, it will throw an error.
#'
#' @param ... Components of the path.
#' @param path Place to start search for package directory.
#' @keywords internal
#' @export
#' @examples
#' \dontrun{
#' package_file("figures", "figure_1")
#' }
package_file <- function(..., path = ".") {
if (!is.character(path) || length(path) != 1) {
cli::cli_abort("{.arg path} must be a string.")
}
if (!dir_exists(path)) {
cli::cli_abort("{.path {path}} is not a directory.")
}
base_path <- path
path <- strip_slashes(path_real(path))
# Walk up to root directory
while (!has_description(path)) {
path <- path_dir(path)
if (is_root(path)) {
cli::cli_abort(c(
"Could not find package root.",
i = "Is {.path {base_path}} inside a package?"
))
}
}
path(path, ...)
}
has_description <- function(path) {
file_exists(path(path, "DESCRIPTION"))
}
is_root <- function(path) {
identical(path, path_dir(path))
}
strip_slashes <- function(x) {
x <- sub("/*$", "", x)
x
}
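# Illustrative behaviour of the path helpers above (comments only, not run):
#   strip_slashes("path/to/pkg///")  # -> "path/to/pkg"
#   is_root("/")                     # -> TRUE on POSIX systems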
# Load package DESCRIPTION into convenient form.
load_pkg_description <- function(path) {
path_desc <- path(path, "DESCRIPTION")
info <- read.dcf(path_desc)[1, ]
Encoding(info) <- 'UTF-8'
desc <- as.list(info)
names(desc) <- tolower(names(desc))
desc$path <- path
structure(desc, class = "package")
}
#' Is the object a package?
#'
#' @keywords internal
#' @export
is.package <- function(x) inherits(x, "package")
# Mockable variant of interactive
interactive <- function() .Primitive("interactive")()
# ---- cran-devtools-945c660/R/pkgbuild.R ----
#' @template devtools
#' @param path Path in which to produce package. If `NULL`, defaults to
#' the parent directory of the package.
#' @inherit pkgbuild::build
#' @note The default `manual = FALSE` is not suitable for a CRAN
#' submission, which may require `manual = TRUE`. Even better, use
#' [submit_cran()] or [release()].
#' @param ... Additional arguments passed to [pkgbuild::build].
#' @export
build <- function(pkg = ".", path = NULL, binary = FALSE, vignettes = TRUE,
manual = FALSE, args = NULL, quiet = FALSE, ...) {
save_all()
if (!file_exists(pkg)) {
cli::cli_abort("{.arg pkg} must exist")
}
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
pkgbuild::build(
path = pkg, dest_path = path, binary = binary,
vignettes = vignettes, manual = manual, args = args, quiet = quiet, ...
)
}
#' @importFrom pkgbuild with_debug
#' @export
pkgbuild::with_debug
#' @importFrom pkgbuild clean_dll
#' @export
pkgbuild::clean_dll
#' @importFrom pkgbuild has_devel
#' @export
pkgbuild::has_devel
#' @importFrom pkgbuild find_rtools
#' @export
pkgbuild::find_rtools
# ---- cran-devtools-945c660/R/pkgload.R ----
#' @inherit pkgload::load_all
#' @param ... Additional arguments passed to [pkgload::load_all()].
#' @export
load_all <- function(path = ".", reset = TRUE, recompile = FALSE,
export_all = TRUE, helpers = TRUE, quiet = FALSE, ...) {
if (inherits(path, "package")) {
path <- path$path
}
save_all()
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
pkgload::load_all(
path = path, reset = reset, recompile = recompile,
export_all = export_all, helpers = helpers, quiet = quiet, ...
)
}
#' @importFrom pkgload unload
#' @export
pkgload::unload
# ---- cran-devtools-945c660/R/r-hub.R ----
#' Run CRAN checks for package on R-hub
#'
#' It runs [build()] on the package, with the arguments specified
#' in `args`, and then submits it to the R-hub builder at
#' <https://builder.r-hub.io>. The `interactive` option controls
#' whether the function waits for the check output. Regardless, after the
#' check is complete, R-hub sends an email with the results to the package
#' maintainer.
#'
#' @section About email validation on R-hub:
#' To build and check R packages on R-hub, you need to validate your
#' email address. This is because R-hub sends out emails about build
#' results. See more at [rhub::validate_email()].
#'
#' @param platforms R-hub platforms to run the check on. If `NULL`
#' uses default list of CRAN checkers (one for each major platform, and
#' one with extra checks if you have compiled code). You can also specify
#' your own, see [rhub::platforms()] for a complete list.
#' @param email email address to notify, defaults to the maintainer
#' address in the package.
#' @param interactive whether to show the status of the build
#' interactively. R-hub will send an email to the package maintainer's
#' email address, regardless of whether the check is interactive or not.
#' @param build_args Arguments passed to `R CMD build`
#' @param ... extra arguments, passed to [rhub::check_for_cran()].
#' @inheritParams check
#' @family build functions
#' @return a `rhub_check` object.
#'
#' @export
check_rhub <- function(pkg = ".",
platforms = NULL,
email = NULL,
interactive = TRUE,
build_args = NULL,
...) {
rlang::check_installed("rhub")
pkg <- as.package(pkg)
built_path <- build(pkg$path, tempdir(), quiet = !interactive,
args = build_args)
on.exit(file_delete(built_path), add = TRUE)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
status <- rhub::check_for_cran(
path = built_path,
email = email,
platforms = platforms,
show_status = interactive,
...
)
if (!interactive) {
cli::cli_inform(c(v = "R-hub check for package {.pkg {pkg$package}} submitted."))
}
status
}
# ---- cran-devtools-945c660/R/release.R ----
#' Release package to CRAN.
#'
#' Run automated and manual tests, then post package to CRAN.
#'
#' The package release process will:
#'
#' \itemize{
#' \item Confirm that the package passes `R CMD check` on relevant platforms
#' \item Confirm that important files are up-to-date
#' \item Build the package
#' \item Submit the package to CRAN, using comments in "cran-comments.md"
#' }
#'
#' You can add arbitrary extra questions by defining an (un-exported) function
#' called `release_questions()` that returns a character vector
#' of additional questions to ask.
#'
#' You also need to read the CRAN repository policy at
#' <https://cran.r-project.org/web/packages/policies.html> and make
#' sure you're in line with the policies. `release` tries to automate as
#' many of the policies as possible, but it's impossible to be completely
#' comprehensive, and they do change in between releases of devtools.
#'
#' @template devtools
#' @param check if `TRUE`, run checking, otherwise omit it. This
#' is useful if you've just checked your package and you're ready to
#' release it.
#' @param args An optional character vector of additional command
#' line arguments to be passed to `R CMD build`.
#' @seealso [usethis::use_release_issue()] to create a checklist of release
#' tasks that you can use in addition to or in place of `release`.
#' @export
release <- function(pkg = ".", check = FALSE, args = NULL) {
pkg <- as.package(pkg)
# Figure out if this is a new package
cran_version <- cran_pkg_version(pkg$package)
new_pkg <- is.null(cran_version)
if (yesno("Have you checked for spelling errors (with `spell_check()`)?")) {
return(invisible())
}
if (check) {
cat_rule(
left = "Building and checking",
right = pkg$package,
line = 2
)
check(pkg,
cran = TRUE, remote = TRUE, manual = TRUE,
build_args = args, run_dont_test = TRUE
)
}
if (yesno("Have you run `R CMD check` locally?")) {
return(invisible())
}
release_checks(pkg)
if (yesno("Were devtools' checks successful?")) {
return(invisible())
}
if (!new_pkg) {
show_cran_check <- TRUE
cran_details <- NULL
end_sentence <- " ?"
if (requireNamespace("foghorn", quietly = TRUE)) {
show_cran_check <- has_cran_results(pkg$package)
cran_details <- foghorn::cran_details(pkg = pkg$package)
}
if (show_cran_check) {
if (!is.null(cran_details)) {
end_sentence <- "\n shown above?"
cat_rule(paste0("Details of the CRAN check results for ", pkg$package))
summary(cran_details)
cat_rule()
}
cran_url <- paste0(
cran_mirror(), "/web/checks/check_results_",
pkg$package, ".html"
)
if (yesno("Have you fixed all existing problems at \n{cran_url}{end_sentence}")) {
return(invisible())
}
}
}
if (yesno("Have you checked on R-hub (with `check_rhub()`)?")) {
return(invisible())
}
if (yesno("Have you checked on win-builder (with `check_win_devel()`)?")) {
return(invisible())
}
deps <- if (new_pkg) 0 else length(revdep(pkg$package))
if (deps > 0) {
msg <- paste0(
"Have you checked the ", deps, " reverse dependencies ",
"(with the revdepcheck package)?"
)
if (yesno(msg)) {
return(invisible())
}
}
questions <- c(
"Have you updated `NEWS.md` file?",
"Have you updated `DESCRIPTION`?",
"Have you updated `cran-comments.md`?",
if (file_exists("codemeta.json")) "Have you updated codemeta.json with codemetar::write_codemeta()?",
find_release_questions(pkg)
)
for (question in questions) {
if (yesno(question)) return(invisible())
}
if (uses_git(pkg$path)) {
git_checks(pkg)
if (yesno("Were Git checks successful?")) {
return(invisible())
}
}
submit_cran(pkg, args = args)
invisible(TRUE)
}
has_cran_results <- function(pkg) {
cran_res <- foghorn::cran_results(
pkg = pkg,
show = c("error", "fail", "warn", "note")
)
sum(cran_res[, -1]) > 0
}
find_release_questions <- function(pkg = ".") {
pkg <- as.package(pkg)
q_fun <- pkgload::ns_env(pkg$package)$release_questions
if (is.null(q_fun)) {
character()
} else {
q_fun()
}
}
yesno <- function(msg, .envir = parent.frame()) {
yeses <- c("Yes", "Definitely", "For sure", "Yup", "Yeah", "Of course", "Absolutely")
nos <- c("No way", "Not yet", "I forget", "No", "Nope", "Uhhhh... Maybe?")
cli::cli_inform(msg, .envir = .envir)
qs <- c(sample(yeses, 1), sample(nos, 2))
rand <- sample(length(qs))
utils::menu(qs[rand]) != which(rand == 1)
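# Note on yesno(): the menu options are shuffled so "yes" is not always item 1,
# and the function returns TRUE when the user did NOT pick the "yes" answer.
# Callers therefore treat TRUE as "stop here", e.g. (illustrative only):
#   if (yesno("Have you updated NEWS.md?")) return(invisible())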
}
maintainer <- function(pkg = ".") {
pkg <- as.package(pkg)
authors <- pkg$`authors@r`
if (!is.null(authors)) {
people <- eval(parse(text = authors))
if (is.character(people)) {
maintainer <- utils::as.person(people)
} else {
maintainer <- Find(function(x) "cre" %in% x$role, people)
}
} else {
maintainer <- pkg$maintainer
if (is.null(maintainer)) {
cli::cli_abort("No maintainer defined in package.")
}
maintainer <- utils::as.person(maintainer)
}
list(
name = paste(maintainer$given, maintainer$family),
email = maintainer$email
)
}
cran_comments <- function(pkg = ".", call = parent.frame()) {
pkg <- as.package(pkg)
path <- path(pkg$path, "cran-comments.md")
if (!file_exists(path)) {
cli::cli_warn(
c(
x = "Can't find {.file cran-comments.md}.",
i = "This file is used to communicate your release process to the CRAN team.",
i = "Create it with {.code use_cran_comments()}."
),
call = call
)
return(character())
}
paste0(readLines(path, warn = FALSE), collapse = "\n")
}
cran_submission_url <- "https://xmpalantir.wu.ac.at/cransubmit/index2.php"
#' Submit a package to CRAN.
#'
#' This uses the new CRAN web-form submission process. After submission, you
#' will receive an email asking you to confirm the submission; this is used
#' to check that the package was submitted by the maintainer.
#'
#' It's recommended that you use [release()] rather than this
#' function as it performs more checks prior to submission.
#'
#' @template devtools
#' @inheritParams release
#' @export
#' @keywords internal
submit_cran <- function(pkg = ".", args = NULL) {
if (yesno("Is your email address {maintainer(pkg)$email}?")) {
return(invisible())
}
pkg <- as.package(pkg)
built_path <- pkgbuild::build(pkg$path, tempdir(), manual = TRUE, args = args)
size <- format(as.object_size(file_info(built_path)$size), units = "auto")
cli::cat_rule("Submitting", col = "cyan")
cli::cli_inform(c(
"i" = "Path {.file {built_path}}",
"i" = "File size: {size}"
))
cli::cat_line()
if (yesno("Ready to submit {pkg$package} ({pkg$version}) to CRAN?")) {
return(invisible())
}
upload_cran(pkg, built_path)
usethis::with_project(pkg$path,
flag_release(pkg)
)
}
extract_cran_msg <- function(msg) {
# Remove "CRAN package Submission" and "Submit package to CRAN"
msg <- gsub("CRAN package Submission|Submit package to CRAN", "", msg)
# remove all html tags
msg <- gsub("<[^>]+>", "", msg)
# remove tabs
msg <- gsub("\t+", "", msg)
# Remove extra newlines
msg <- gsub("\n+", "\n", msg)
msg
}
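# Rough sketch of extract_cran_msg() on a hypothetical maintenance page:
#   extract_cran_msg("<h1>CRAN package Submission</h1>\n\n<p>Offline for maintenance</p>")
#   # -> "\nOffline for maintenance" (boilerplate, tags, tabs, repeated newlines removed)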
upload_cran <- function(pkg, built_path, call = parent.frame()) {
pkg <- as.package(pkg)
maint <- maintainer(pkg)
comments <- cran_comments(pkg, call = call)
# Initial upload ---------
cli::cli_inform(c(i = "Uploading package & comments"))
body <- list(
pkg_id = "",
name = maint$name,
email = maint$email,
uploaded_file = httr::upload_file(built_path, "application/x-gzip"),
comment = comments,
upload = "Upload package"
)
r <- httr::POST(cran_submission_url, body = body)
# If a 404 likely CRAN is closed for maintenance, try to get the message
if (httr::status_code(r) == 404) {
msg <- ""
try({
r2 <- httr::GET(sub("index2", "index", cran_submission_url))
msg <- extract_cran_msg(httr::content(r2, "text"))
})
cli::cli_abort(
c(
"*" = "Submission failed",
"x" = msg
),
call = call
)
}
httr::stop_for_status(r)
new_url <- httr::parse_url(r$url)
# Confirmation -----------
cli::cli_inform(c(i = "Confirming submission"))
body <- list(
pkg_id = new_url$query$pkg_id,
name = maint$name,
email = maint$email,
policy_check = "1/",
submit = "Submit package"
)
r <- httr::POST(cran_submission_url, body = body)
httr::stop_for_status(r)
new_url <- httr::parse_url(r$url)
if (new_url$query$submit == "1") {
cli::cli_inform(c(
"v" = "Package submission successful",
"i" = "Check your email for confirmation link."
))
} else {
cli::cli_abort("Package failed to upload.", call = call)
}
invisible(TRUE)
}
as.object_size <- function(x) structure(x, class = "object_size")
flag_release <- function(pkg = ".") {
pkg <- as.package(pkg)
if (!uses_git(pkg$path)) {
return(invisible())
}
cli::cli_inform(c("!" = "Don't forget to tag this release once accepted by CRAN"))
withr::with_dir(pkg$path, {
sha <- system2("git", c("rev-parse", "HEAD"), stdout = TRUE)
})
dat <- list(
Version = pkg$version,
Date = format(Sys.time(), tz = "UTC", usetz = TRUE),
SHA = sha
)
write.dcf(dat, file = path(pkg$path, "CRAN-SUBMISSION"))
usethis::use_build_ignore("CRAN-SUBMISSION")
}
cran_mirror <- function(repos = getOption("repos")) {
repos[repos == "@CRAN@"] <- "https://cloud.r-project.org"
if (is.null(names(repos))) {
names(repos) <- "CRAN"
}
repos[["CRAN"]]
}
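# Example behaviour of cran_mirror() (illustrative only):
#   cran_mirror(c(CRAN = "@CRAN@"))            # -> "https://cloud.r-project.org"
#   cran_mirror("https://cran.example.org")    # unnamed entry is assumed to be CRAN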
# Return the version of a package on CRAN (or other repository)
# @param package The name of the package.
# @param available A matrix of information about packages.
cran_pkg_version <- function(package, available = available.packages()) {
idx <- available[, "Package"] == package
if (any(idx)) {
as.package_version(available[package, "Version"])
} else {
NULL
}
}
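# Example (illustrative; requires network access to query CRAN):
#   cran_pkg_version("ggplot2")          # -> the current CRAN version, as a package_version
#   cran_pkg_version("not-a-real-pkg")   # -> NULL (hypothetical package name)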
# ---- cran-devtools-945c660/R/reload.R ----
#' Unload and reload package.
#'
#' This attempts to unload and reload an _installed_ package. If the package is
#' not loaded already, it does nothing. It's not always possible to cleanly
#' unload a package: see the caveats in [unload()] for some of the potential
#' failure points. If in doubt, restart R and reload the package with
#' [library()].
#'
#' @template devtools
#' @param quiet if `TRUE` suppresses output from this function.
#' @seealso [load_all()] to load a package for interactive development.
#' @examples
#' \dontrun{
#' # Reload package that is in current directory
#' reload(".")
#'
#' # Reload package that is in ./ggplot2/
#' reload("ggplot2/")
#'
#' # Can use inst() to find the package path
#' # This will reload the installed ggplot2 package
#' reload(pkgload::inst("ggplot2"))
#' }
#' @export
reload <- function(pkg = ".", quiet = FALSE) {
pkg <- as.package(pkg)
if (is_attached(pkg)) {
if (!quiet) cli::cli_inform(c(i = "Reloading attached {.pkg {pkg$package}}"))
pkgload::unload(pkg$package)
require(pkg$package, character.only = TRUE, quietly = TRUE)
} else if (is_loaded(pkg)) {
if (!quiet) cli::cli_inform(c(i = "Reloading loaded {.pkg {pkg$package}}"))
pkgload::unload(pkg$package)
requireNamespace(pkg$package, quietly = TRUE)
}
}
# ---- cran-devtools-945c660/R/remotes.R ----
#' @importFrom ellipsis check_dots_used
with_ellipsis <- function(fun) {
b <- body(fun)
f <- function(...) {
ellipsis::check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
!! b
}
f <- rlang::expr_interp(f)
body(fun) <- body(f)
fun
}
with_pkgbuild_build_tools <- function(fun) {
b <- body(fun)
pkgbuild_call <- as.call(c(call("::", as.symbol("pkgbuild"), as.symbol("with_build_tools")), b, list(required = FALSE)))
body(fun) <- pkgbuild_call
fun
}
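# Roughly, with_pkgbuild_build_tools(f) rewrites the body of f so that it runs as:
#   pkgbuild::with_build_tools({ <original body of f> }, required = FALSE)
# i.e. build tools (Rtools or equivalent) are put on the PATH, if available,
# for the duration of the call.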
#' Functions re-exported from the remotes package
#'
#' These functions are re-exported from the remotes package. They differ only
#' in that the ones in devtools use the [ellipsis] package to ensure all dotted
#' arguments are used.
#'
#' Follow the links below to see the documentation.
#' [remotes::install_bioc()], [remotes::install_bitbucket()], [remotes::install_cran()], [remotes::install_dev()],
#' [remotes::install_git()], [remotes::install_github()], [remotes::install_gitlab()], [remotes::install_local()],
#' [remotes::install_svn()], [remotes::install_url()], [remotes::install_version()], [remotes::update_packages()],
#' [remotes::dev_package_deps()].
#'
#' @importFrom remotes install_bioc
#' @name remote-reexports
#' @keywords internal
#' @export
install_bioc <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_bioc))
#' @importFrom remotes install_bitbucket
#' @rdname remote-reexports
#' @export
install_bitbucket <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_bitbucket))
#' @importFrom remotes install_cran
#' @rdname remote-reexports
#' @export
install_cran <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_cran))
#' @importFrom remotes install_dev
#' @rdname remote-reexports
#' @export
install_dev <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_dev))
#' @importFrom remotes install_git
#' @rdname remote-reexports
#' @export
install_git <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_git))
#' @importFrom remotes install_github
#' @rdname remote-reexports
#' @export
install_github <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_github))
#' @importFrom remotes github_pull
#' @rdname reexports
#' @export
remotes::github_pull
#' @importFrom remotes github_release
#' @rdname reexports
#' @export
remotes::github_release
#' @importFrom remotes install_gitlab
#' @rdname remote-reexports
#' @export
install_gitlab <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_gitlab))
#' @importFrom remotes install_local
#' @rdname remote-reexports
#' @export
install_local <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_local))
#' @importFrom remotes install_svn
#' @rdname remote-reexports
#' @export
install_svn <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_svn))
#' @importFrom remotes install_url
#' @rdname remote-reexports
#' @export
install_url <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_url))
#' @importFrom remotes install_version
#' @rdname remote-reexports
#' @export
install_version <- with_pkgbuild_build_tools(with_ellipsis(remotes::install_version))
#' @importFrom remotes update_packages
#' @rdname remote-reexports
#' @export
update_packages <- with_pkgbuild_build_tools(with_ellipsis(remotes::update_packages))
#' @importFrom remotes dev_package_deps
#' @rdname remote-reexports
#' @export
dev_package_deps <- with_pkgbuild_build_tools(remotes::dev_package_deps)
# ---- cran-devtools-945c660/R/revdep.R ----
#' Reverse dependency tools.
#'
#' Tools to check and notify maintainers of all CRAN and Bioconductor
#' packages that depend on the specified package.
#'
#' The first run in a session will be time-consuming because it must download
#' all package metadata from CRAN and Bioconductor. Subsequent runs will
#' be faster.
#'
#' @param pkg Package name. This is unlike most devtools functions, which
#' take a path, because you might want to determine dependencies for a package
#' that you don't have installed. If omitted, defaults to the name of the
#' current package.
#' @param ignore A character vector of package names to ignore. These packages
#' will not appear in the returned vector.
#' @param dependencies A character vector listing the types of dependencies
#' to follow.
#' @param bioconductor If `TRUE` also look for dependencies amongst
#' Bioconductor packages.
#' @param recursive If `TRUE` look for full set of recursive dependencies.
#' @seealso The [revdepcheck](https://github.com/r-lib/revdepcheck) package can
#' be used to run R CMD check on all reverse dependencies.
#' @export
#' @keywords internal
#' @examples
#' \dontrun{
#' revdep("ggplot2")
#'
#' revdep("ggplot2", ignore = c("xkcd", "zoo"))
#' }
revdep <- function(pkg,
dependencies = c("Depends", "Imports", "Suggests", "LinkingTo"),
recursive = FALSE, ignore = NULL,
bioconductor = FALSE) {
if (missing(pkg)) pkg <- as.package(".")$package
all <- if (bioconductor) packages() else cran_packages()
deps <- tools::dependsOnPkgs(pkg, dependencies, recursive, installed = all)
deps <- setdiff(deps, ignore)
sort_ci(deps)
}
#' @rdname revdep
#' @export
revdep_maintainers <- function(pkg = ".") {
if (missing(pkg)) pkg <- as.package(".")$package
maintainers <- unique(packages()[revdep(pkg), "Maintainer"])
class(maintainers) <- "maintainers"
maintainers
}
#' @export
print.maintainers <- function(x, ...) {
x <- gsub("\n", " ", x)
cat(x, sep = ",\n")
cat("\n")
}
# Package caches ----------------------------------------------------------
cran_packages <- memoise::memoise(
function() {
local <- path_temp("packages.rds")
utils::download.file("https://cran.R-project.org/web/packages/packages.rds", local,
mode = "wb", quiet = TRUE
)
on.exit(file_delete(local))
cp <- readRDS(local)
rownames(cp) <- unname(cp[, 1])
cp
},
~memoise::timeout(30 * 60)
)
bioc_packages <- memoise::memoise(
function(views = paste(BiocManager::repositories()[["BioCsoft"]], "VIEWS", sep = "/")) {
con <- url(views)
on.exit(close(con))
bioc <- read.dcf(con)
rownames(bioc) <- bioc[, 1]
bioc
},
~memoise::timeout(30 * 60)
)
packages <- function() {
bioc <- bioc_packages()
cran <- cran_packages()
cols <- intersect(colnames(cran), colnames(bioc))
rbind(cran[, cols], bioc[, cols])
}
# ---- cran-devtools-945c660/R/run-examples.R ----
#' Run all examples in a package.
#'
#' One of the most frustrating parts of `R CMD check` is getting all of your
#' examples to pass - whenever one fails you need to fix the problem and then
#' restart the whole process. This function makes it a little easier by
#' making it possible to run all examples from an R function.
#'
#' @template devtools
#' @inheritParams pkgload::run_example
#' @param start Where to start running the examples: this can either be the
#' name of `Rd` file to start with (with or without extensions), or
#' a topic name. If omitted, will start with the (lexicographically) first
#' file. This is useful if you have a lot of examples and don't want to
#' rerun them every time you fix a problem.
#' @family example functions
#' @param show DEPRECATED.
#' @param fresh if `TRUE`, will be run in a fresh R session. This has
#' the advantage that there's no way the examples can depend on anything in
#' the current session, but interactive code (like [browser()])
#' won't work.
#' @param document if `TRUE`, [document()] will be run to ensure
#' examples are updated before running them.
#' @keywords programming
#' @export
run_examples <- function(pkg = ".", start = NULL, show = deprecated(), run_donttest = FALSE,
run_dontrun = FALSE, fresh = FALSE, document = TRUE,
run = deprecated(), test = deprecated()) {
if (!missing(run)) {
lifecycle::deprecate_warn("2.3.1", "run_examples(run)", "run_examples(run_dontrun)")
run_dontrun <- run
}
if (!missing(test)) {
lifecycle::deprecate_warn("2.3.1", "run_examples(test)", "run_examples(run_donttest)")
run_donttest <- test
}
if (!missing(show)) {
lifecycle::deprecate_warn("2.3.1", "run_examples(show)")
}
pkg <- as.package(pkg)
if (fresh) {
to_run <-
function(path, start, run_donttest, run_dontrun) devtools::run_examples(pkg = path, start = start, run_donttest = run_donttest, run_dontrun = run_dontrun, document = FALSE)
callr::r(to_run, args = list(path = pkg$path, start = start, run_donttest = run_donttest, run_dontrun = run_dontrun), show = TRUE, spinner = FALSE, stderr = "2>&1")
return(invisible())
}
if (document) {
document(pkg)
}
files <- rd_files(pkg$path, start = start)
if (length(files) == 0) {
return()
}
cat_rule(
left = paste0("Running ", length(files), " example files"),
right = pkg$package
)
load_all(pkg$path, reset = TRUE, export_all = FALSE)
on.exit(load_all(pkg$path, reset = TRUE))
lapply(files, pkgload::run_example, run_donttest = run_donttest, run_dontrun = run_dontrun)
invisible()
}
# If an error occurs, should print out the suspect line of code, and offer
# the following options:
# * skip to the next example
# * quit
# * browser
# * rerun example and rerun
# * reload code and rerun
rd_files <- function(pkg = ".", start = NULL) {
pkg <- as.package(pkg)
path_man <- path(pkg$path, "man")
files <- dir_ls(path_man, regexp = "\\.[Rr]d$")
names(files) <- path_file(files)
files <- sort_ci(files)
if (!is.null(start)) {
topic <- pkgload::dev_help(start, dev_packages = pkg$package)
start_path <- path_file(topic$path)
start_pos <- which(names(files) == start_path)
if (length(start_pos) == 1) {
files <- files[-seq(1, start_pos - 1)]
}
}
files
}
# ---- cran-devtools-945c660/R/run-source.R ----
#' Run a script through some protocols such as http, https, ftp, etc.
#'
#' If a SHA-1 hash is specified with the `sha1` argument, then this
#' function will check the SHA-1 hash of the downloaded file to make sure it
#' matches the expected value, and throw an error if it does not match. If the
#' SHA-1 hash is not specified, it will print a message displaying the hash of
#' the downloaded file. The purpose of this is to improve security when running
#' remotely-hosted code; if you have a hash of the file, you can be sure that
#' it has not changed. For convenience, it is possible to use a truncated SHA-1
#' hash, down to 6 characters, but keep in mind that a truncated hash won't be
#' as secure as the full hash.
#'
#' @param url url
#' @param ... other options passed to [source()]
#' @param sha1 The (prefix of the) SHA-1 hash of the file at the remote URL.
#' @export
#' @seealso [source_gist()]
#' @examples
#' \dontrun{
#'
#' source_url("https://gist.github.com/hadley/6872663/raw/hi.r")
#'
#' # With a hash, to make sure the remote file hasn't changed
#' source_url("https://gist.github.com/hadley/6872663/raw/hi.r",
#' sha1 = "54f1db27e60bb7e0486d785604909b49e8fef9f9")
#'
#' # With a truncated hash
#' source_url("https://gist.github.com/hadley/6872663/raw/hi.r",
#' sha1 = "54f1db27e60")
#' }
source_url <- function(url, ..., sha1 = NULL) {
stopifnot(is.character(url), length(url) == 1)
rlang::check_installed("digest")
temp_file <- file_temp()
on.exit(file_delete(temp_file), add = TRUE)
request <- httr::GET(url)
httr::stop_for_status(request)
writeBin(httr::content(request, type = "raw"), temp_file)
check_sha1(temp_file, sha1)
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
source(temp_file, ...)
}
check_sha1 <- function(path, sha1) {
file_sha1 <- digest::digest(file = path, algo = "sha1")
if (is.null(sha1)) {
cli::cli_inform(c(i = "SHA-1 hash of file is {.str {file_sha1}}"))
} else {
if (nchar(sha1) < 6) {
cli::cli_abort("{.arg sha1} must be at least 6 characters, not {nchar(sha1)}.")
}
# Truncate file_sha1 to length of sha1
file_sha1 <- substr(file_sha1, 1, nchar(sha1))
if (!identical(file_sha1, sha1)) {
cli::cli_abort(
"{.arg sha1} ({.str {sha1}}) doesn't match SHA-1 hash of downloaded file ({.str {file_sha1}})"
)
}
}
}
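# Illustrative use of check_sha1() (assumes the digest package is installed;
# the file and hash below are hypothetical):
#   f <- file_temp(); writeLines("x", f)
#   check_sha1(f, sha1 = NULL)      # informs with the file's full SHA-1 hash
#   check_sha1(f, sha1 = "000000")  # aborts unless the first 6 hash characters match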
#' Run a script on gist
#'
#' \dQuote{Gist is a simple way to share snippets and pastes with others.
#' All gists are git repositories, so they are automatically versioned,
#' forkable and usable as a git repository.}
#' <https://gist.github.com/>
#'
#' @param id either full url (character), gist ID (numeric or character of
#' numeric).
#' @param ... other options passed to [source()]
#' @param filename if there is more than one R file in the gist, which one to
#' source (filename ending in '.R')? Default `NULL` will source the
#' first file.
#' @param sha1 The SHA-1 hash of the file at the remote URL. This is highly
#' recommend as it prevents you from accidentally running code that's not
#' what you expect. See [source_url()] for more information on
#' using a SHA-1 hash.
#' @param quiet if `FALSE`, the default, prints informative messages.
#' @export
#' @seealso [source_url()]
#' @examples
#' \dontrun{
#' # You can run gists given their id
#' source_gist(6872663)
#' source_gist("6872663")
#'
#' # Or their html url
#' source_gist("https://gist.github.com/hadley/6872663")
#' source_gist("gist.github.com/hadley/6872663")
#'
#' # It's highly recommend that you run source_gist with the optional
#' # sha1 argument - this will throw an error if the file has changed since
#' # you first ran it
#' source_gist(6872663, sha1 = "54f1db27e60")
#' # Wrong hash will result in error
#' source_gist(6872663, sha1 = "54f1db27e61")
#'
#' # You can specify a particular R file in the gist
#' source_gist(6872663, filename = "hi.r")
#' source_gist(6872663, filename = "hi.r", sha1 = "54f1db27e60")
#' }
source_gist <- function(id, ..., filename = NULL, sha1 = NULL, quiet = FALSE) {
rlang::check_installed("gh")
stopifnot(length(id) == 1)
url_match <- "((^https://)|^)gist.github.com/([^/]+/)?([0-9a-f]+)$"
if (grepl(url_match, id)) {
# https://gist.github.com/kohske/1654919, https://gist.github.com/1654919,
# or gist.github.com/1654919
id <- regmatches(id, regexec(url_match, id))[[1]][5]
url <- find_gist(id, filename)
} else if (is.numeric(id) || grepl("^[0-9a-f]+$", id)) {
# 1654919 or "1654919"
url <- find_gist(id, filename)
} else {
cli::cli_abort("Invalid gist id specification {.str {id}}")
}
if (!quiet) {
cli::cli_inform(c(i = "Sourcing gist {.str {id}}"))
}
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
source_url(url, ..., sha1 = sha1)
}
find_gist <- function(id, filename = NULL, call = parent.frame()) {
files <- gh::gh("GET /gists/:id", id = id)$files
r_files <- files[grepl("\\.[rR]$", names(files))]
if (length(r_files) == 0) {
cli::cli_abort("No R files found in gist", call = call)
}
if (!is.null(filename)) {
if (!is.character(filename) || length(filename) > 1 || !grepl("\\.[rR]$", filename)) {
cli::cli_abort(
"{.arg filename} must be {.code NULL}, or a single filename ending in .R/.r",
call = call
)
}
which <- match(tolower(filename), tolower(names(r_files)))
if (is.na(which)) {
cli::cli_abort("{.path {filename}} not found in gist", call = call)
}
} else {
if (length(r_files) > 1) {
cli::cli_inform("{length(r_files)} .R files in gist, using first", call = call)
}
which <- 1
}
r_files[[which]]$raw_url
}
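The URL handling in `source_gist()` relies on a single regular expression; element 5 of the `regexec()` match (the fourth capture group) is the gist ID. A standalone sketch of that extraction, with no API call:

```r
# Sketch of the gist-id extraction used in source_gist(): element 1 of the
# regexec() match is the full match, elements 2-5 are the capture groups,
# so element 5 is ([0-9a-f]+), the id.
url_match <- "((^https://)|^)gist.github.com/([^/]+/)?([0-9a-f]+)$"
id_from <- function(x) regmatches(x, regexec(url_match, x))[[1]][5]

id_from("https://gist.github.com/hadley/6872663") # "6872663"
id_from("gist.github.com/6872663") # "6872663"
```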
| 5,727 | mit |
devtools | cran-devtools-945c660/R/save-all.R | #' Save all documents in an active IDE session.
#'
#' Helper function wrapping IDE-specific calls to save all documents in the
#' active session. In this form, callers of `save_all()` don't need to
#' execute any IDE-specific code. This function can be extended to include
#' other IDE implementations of their equivalent
#' `rstudioapi::documentSaveAll()` methods.
#' @return NULL
save_all <- function() {
if (rstudioapi::hasFun("documentSaveAll")) {
rstudioapi::documentSaveAll()
}
}
| 494 | mit |
devtools | cran-devtools-945c660/R/session-info.R | #' Return a data frame of attached packages
#' @export
#' @keywords internal
#' @return A data frame with columns package and path, giving the name of
#' each package and the path it was loaded from.
loaded_packages <- function() {
attached <- data.frame(
package = search(),
path = searchpaths(),
stringsAsFactors = FALSE
)
packages <- attached[grepl("^package:", attached$package), , drop = FALSE]
rownames(packages) <- NULL
packages$package <- sub("^package:", "", packages$package)
packages
}
#' Return a vector of names of packages loaded by devtools
#' @export
#' @keywords internal
dev_packages <- function() {
packages <- vapply(
loadedNamespaces(),
function(x) !is.null(pkgload::dev_meta(x)), logical(1)
)
names(packages)[packages]
}
#' @export
#' @importFrom sessioninfo session_info
sessioninfo::session_info
#' @export
#' @importFrom sessioninfo package_info
sessioninfo::package_info
| 947 | mit |
devtools | cran-devtools-945c660/R/show-news.R | #' Show package news
#'
#' @template devtools
#' @param latest if `TRUE`, only show the news for the most recent
#' version.
#' @param ... other arguments passed on to `news`
#' @export
show_news <- function(pkg = ".", latest = TRUE, ...) {
pkg <- as.package(pkg)
news_path <- path(pkg$path, "NEWS")
if (!file_exists(news_path)) {
cli::cli_abort("No NEWS found")
}
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
out <- utils::news(..., db = ("tools" %:::% ".news_reader_default")(news_path))
if (latest) {
ver <- numeric_version(out$Version)
recent <- ver == max(ver)
structure(out[recent, ],
class = class(out),
bad = attr(out, "bad")[recent]
)
} else {
out
}
}
| 752 | mit |
devtools | cran-devtools-945c660/R/sitrep.R | # Suppress R CMD check note
#' @importFrom memoise memoise
NULL
rstudio_version_string <- function() {
if (!is_rstudio_running()) {
return(character())
}
rvi <- rstudioapi::versionInfo()
rvi$long_version %||% as.character(rvi$version)
}
check_for_rstudio_updates <- function(os = tolower(Sys.info()[["sysname"]]),
version = rstudio_version_string(),
in_rstudio = is_rstudio_running()) {
if (!in_rstudio) {
return()
}
url <- sprintf(
"https://www.rstudio.org/links/check_for_update?version=%s&os=%s&format=%s&manual=true",
utils::URLencode(version, reserved = TRUE), os, "kvp"
)
tmp <- file_temp()
withr::defer(file_exists(tmp) && nzchar(file_delete(tmp)))
suppressWarnings(
download_ok <- tryCatch({
utils::download.file(url, tmp, quiet = TRUE)
TRUE
}, error = function(e) FALSE)
)
if (!download_ok) {
return(
sprintf("Unable to check for RStudio updates (you're using %s).", version)
)
}
result <- readLines(tmp, warn = FALSE)
result <- strsplit(result, "&")[[1]]
result <- strsplit(result, "=")
# If no values then we are current
if (length(result[[1]]) == 1) {
return()
}
nms <- vcapply(result, `[[`, 1)
values <- vcapply(result, function(x) utils::URLdecode(x[[2]]))
result <- stats::setNames(values, nms)
if (!nzchar(result[["update-version"]])) {
return()
}
return(
sprintf("%s.\nDownload at: %s",
result[["update-message"]],
ui_field(result[["update-url"]])
)
)
}
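The update endpoint replies in a key=value&key=value ("kvp") format. The parsing steps above can be sketched in isolation; the response body below is made up for illustration:

```r
# Sketch of the kvp parsing in check_for_rstudio_updates(), using a
# hypothetical response body.
body <- "update-version=2023.12.1&update-url=https%3A%2F%2Fexample.com&update-message=An update is available"

pairs <- strsplit(strsplit(body, "&")[[1]], "=")
nms <- vapply(pairs, `[[`, character(1), 1)
values <- vapply(pairs, function(x) utils::URLdecode(x[[2]]), character(1))
stats::setNames(values, nms)
```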
.r_release <- function() {
R_system_version(rversions::r_release()$version)
}
r_release <- memoise::memoise(.r_release)
#' Report package development situation
#'
#' @template devtools
#' @inheritParams pkgbuild::has_build_tools
#' @description `dev_sitrep()` reports
#' * If R is up to date
#' * If RStudio is up to date
#' * If compiler build tools are installed and available for use
#' * If devtools and its dependencies are up to date
#' * If the package's dependencies are up to date
#'
#' @description Call this function if things seem weird and you're not sure
#' what's wrong or how to fix it. If this function returns no output
#' everything should be ready for package development.
#'
#' @return A named list, with S3 class `dev_sitrep` (for printing purposes).
#' @importFrom usethis ui_code ui_field ui_todo ui_value ui_done ui_path
#' @export
#' @examples
#' \dontrun{
#' dev_sitrep()
#' }
dev_sitrep <- function(pkg = ".", debug = FALSE) {
pkg <- tryCatch(as.package(pkg), error = function(e) NULL)
has_build_tools <- !is_windows || pkgbuild::has_build_tools(debug = debug)
structure(
list(
pkg = pkg,
r_version = getRversion(),
r_path = path_real(R.home()),
r_release_version = r_release(),
has_build_tools = has_build_tools,
rtools_path = if (has_build_tools) pkgbuild::rtools_path(),
devtools_version = packageVersion("devtools"),
devtools_deps = remotes::package_deps("devtools", dependencies = NA),
pkg_deps = if (!is.null(pkg)) { remotes::dev_package_deps(pkg$path, dependencies = TRUE) },
rstudio_version = if (is_rstudio_running()) rstudioapi::getVersion(),
rstudio_msg = check_for_rstudio_updates()
),
class = "dev_sitrep"
)
}
#' @export
print.dev_sitrep <- function(x, ...) {
all_ok <- TRUE
hd_line("R")
kv_line("version", x$r_version)
kv_line("path", x$r_path, path = TRUE)
if (x$r_version < x$r_release_version) {
ui_todo('
{ui_field("R")} is out of date ({ui_value(x$r_version)} vs {ui_value(x$r_release_version)})
')
all_ok <- FALSE
}
if (is_windows) {
hd_line("Rtools")
if (x$has_build_tools) {
kv_line("path", x$rtools_path, path = TRUE)
} else {
ui_todo('
{ui_field("RTools")} is not installed:
Download and install it from: {ui_field("https://cloud.r-project.org/bin/windows/Rtools/")}
')
all_ok <- FALSE
}
}
if (!is.null(x$rstudio_version)) {
hd_line("RStudio")
kv_line("version", x$rstudio_version)
if (!is.null(x$rstudio_msg)) {
ui_todo(x$rstudio_msg)
all_ok <- FALSE
}
}
hd_line("devtools")
kv_line("version", x$devtools_version)
devtools_deps_old <- x$devtools_deps$diff < 0
if (any(devtools_deps_old)) {
ui_todo('
{ui_field("devtools")} or its dependencies out of date:
{paste(ui_value(x$devtools_deps$package[devtools_deps_old]), collapse = ", ")}
Update them with {ui_code("devtools::update_packages(\\"devtools\\")")}
')
all_ok <- FALSE
}
hd_line("dev package")
kv_line("package", x$pkg$package)
kv_line("path", x$pkg$path, path = TRUE)
pkg_deps_old <- x$pkg_deps$diff < 0
if (any(pkg_deps_old)) {
ui_todo('
{ui_field(x$pkg$package)} dependencies out of date:
{paste(ui_value(x$pkg_deps$package[pkg_deps_old]), collapse = ", ")}
Update them with {ui_code("devtools::install_dev_deps()")}
')
all_ok <- FALSE
}
if (all_ok) {
ui_done("
All checks passed
")
}
invisible(x)
}
# Helpers -----------------------------------------------------------------
hd_line <- function(name) {
cat_rule(cli::style_bold(name))
}
kv_line <- function(key, value, path = FALSE) {
if (is.null(value)) {
value <- cli::col_silver("<unset>")
} else {
if (path) {
value <- ui_path(value, base = NA)
} else {
value <- ui_value(value)
}
}
cli::cat_line(cli::symbol$bullet, " ", key, ": ", value)
}
| 5,602 | mit |
devtools | cran-devtools-945c660/R/spell-check.R | #' Spell checking
#'
#' Runs a spell check on text fields in the package description file, manual
#' pages, and optionally vignettes. Wraps the \link[spelling:spell_check_package]{spelling}
#' package.
#'
#' @export
#' @rdname spell_check
#' @template devtools
#' @param vignettes also check all `rmd` and `rnw` files in the pkg `vignettes` folder
#' @param use_wordlist ignore words in the package [WORDLIST][spelling::get_wordlist] file
spell_check <- function(pkg = ".", vignettes = TRUE, use_wordlist = TRUE) {
rlang::check_installed("spelling")
pkg <- as.package(pkg)
spelling::spell_check_package(pkg = pkg, vignettes = vignettes, use_wordlist = use_wordlist)
}
| 674 | mit |
devtools | cran-devtools-945c660/R/test.R | #' Execute testthat tests in a package
#'
#' @description
#' * `test()` runs all tests in a package. It's a shortcut for
#' [testthat::test_dir()]
#' * `test_active_file()` runs `test()` on the active file.
#' * `test_coverage()` computes test coverage for your package. It's a
#' shortcut for [covr::package_coverage()] plus [covr::report()].
#' * `test_coverage_active_file()` computes test coverage for the active file. It's a
#' shortcut for [covr::file_coverage()] plus [covr::report()].
#'
#' @template devtools
#' @param ... additional arguments passed to wrapped functions.
#' @param file One or more source or test files. If a source file the
#' corresponding test file will be run. The default is to use the active file
#' in RStudio (if available).
#' @inheritParams testthat::test_dir
#' @inheritParams pkgload::load_all
#' @inheritParams run_examples
#' @export
test <- function(pkg = ".", filter = NULL, stop_on_failure = FALSE, export_all = TRUE, ...) {
save_all()
pkg <- as.package(pkg)
if (!uses_testthat(pkg)) {
cli::cli_inform(c(i = "No testing infrastructure found."))
if (!interactive()) {
ui_todo('Setup testing with {ui_code("usethis::use_testthat()")}.')
return(invisible())
}
if (yesno("Create it?")) {
return(invisible())
}
usethis_use_testthat(pkg)
return(invisible())
}
cli::cli_inform(c(i = "Testing {.pkg {pkg$package}}"))
withr::local_envvar(r_env_vars())
load_package <- load_package_if_needed(pkg)
testthat::test_local(
pkg$path,
filter = filter,
stop_on_failure = stop_on_failure,
load_package = load_package,
...
)
}
#' @rdname devtools-deprecated
#' @export
test_file <- function(file = find_active_file(), ...) {
lifecycle::deprecate_soft("2.4.0", "test_file()", "test_active_file()")
test_active_file(file, ...)
}
#' @export
#' @rdname test
test_active_file <- function(file = find_active_file(), ...) {
save_all()
test_files <- find_test_file(file)
pkg <- as.package(path_dir(test_files)[[1]])
withr::local_envvar(r_env_vars())
if (is_rstudio_running()) {
rstudioapi::executeCommand("activateConsole", quiet = TRUE)
}
load_package <- load_package_if_needed(pkg)
testthat::test_file(
test_files,
package = pkg$package,
load_package = load_package,
...
)
}
load_package_if_needed <- function(pkg) {
if (pkg$package == "testthat") {
# Must load testthat outside of testthat so tests are run with
# dev testthat
load_all(pkg$path, quiet = TRUE)
"none"
} else {
"source"
}
}
#' @param show_report Show the test coverage report.
#' @export
#' @rdname test
test_coverage <- function(pkg = ".", show_report = interactive(), ...) {
rlang::check_installed(c("covr", "DT"))
save_all()
pkg <- as.package(pkg)
cli::cli_inform(c(i = "Computing test coverage for {.pkg {pkg$package}}"))
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
withr::local_envvar(r_env_vars())
testthat::local_test_directory(pkg$path, pkg$package)
coverage <- covr::package_coverage(pkg$path, ...)
if (isTRUE(show_report)) {
covr::report(coverage)
}
invisible(coverage)
}
#' @rdname devtools-deprecated
#' @export
test_coverage_file <- function(file = find_active_file(), ...) {
lifecycle::deprecate_soft("2.4.0", "test_coverage()", "test_coverage_active_file()")
test_coverage_active_file(file, ...)
}
#' @rdname test
#' @export
test_coverage_active_file <- function(file = find_active_file(), filter = TRUE, show_report = interactive(), export_all = TRUE, ...) {
rlang::check_installed(c("covr", "DT"))
save_all()
test_files <- find_test_file(file)
pkg <- as.package(path_dir(file)[[1]])
check_dots_used(action = getOption("devtools.ellipsis_action", rlang::warn))
withr::local_envvar(r_env_vars())
testthat::local_test_directory(pkg$path, pkg$package)
reporter <- testthat::local_snapshotter()
reporter$start_file(file, "test")
env <- load_all(pkg$path, quiet = TRUE, export_all = export_all)$env
testthat::with_reporter(reporter, {
coverage <- covr::environment_coverage(env, test_files, ...)
})
if (isTRUE(filter)) {
coverage_name <- name_source(covr::display_name(coverage))
local_name <- name_test(file)
coverage <- coverage[coverage_name %in% local_name]
}
# Use relative paths
attr(coverage, "relative") <- TRUE
attr(coverage, "package") <- pkg
if (isTRUE(show_report)) {
covered <- unique(covr::display_name(coverage))
if (length(covered) == 1) {
covr::file_report(coverage)
} else {
covr::report(coverage)
}
}
invisible(coverage)
}
#' Does a package use testthat?
#'
#' @export
#' @keywords internal
uses_testthat <- function(pkg = ".") {
pkg <- as.package(pkg)
paths <- c(
path(pkg$path, "inst", "tests"),
path(pkg$path, "tests", "testthat")
)
any(dir_exists(paths))
}
| 4,930 | mit |
devtools | cran-devtools-945c660/R/uninstall.R | #' Uninstall a local development package
#'
#' Uses `remove.packages()` to uninstall the package. To uninstall a package
#' from a non-default library, use in combination with [withr::with_libpaths()].
#'
#' @inheritParams install
#' @param unload if `TRUE` (the default), ensures the package is unloaded, prior
#' to uninstalling.
#' @inheritParams utils::remove.packages
#' @export
#' @family package installation
#' @seealso [with_debug()] to install packages with debugging flags set.
uninstall <- function(pkg = ".", unload = TRUE, quiet = FALSE, lib = .libPaths()[[1]]) {
pkg <- as.package(pkg)
if (unload && pkg$package %in% loaded_packages()$package) {
pkgload::unload(pkg$package)
}
if (!quiet) {
cli::cli_inform(c(i = "Uninstalling {.pkg {pkg$package}}"))
}
remove.packages(pkg$package, lib)
invisible(TRUE)
}
| 861 | mit |
devtools | cran-devtools-945c660/R/usethis.R | # Some helpers around usethis functions
# we need to import some usethis function so the namespace is loaded when
# devtools is loaded, but not attached.
#' @importFrom usethis use_test
NULL
usethis_use_testthat <- function(pkg) {
usethis::local_project(pkg$path, quiet = FALSE)
usethis::use_testthat()
}
usethis_use_directory <- function(pkg, path, ignore = FALSE) {
usethis::local_project(pkg$path, quiet = TRUE)
usethis::use_directory(path, ignore)
}
usethis_use_git_ignore <- function(pkg, ignores, ignore = FALSE) {
usethis::local_project(pkg$path, quiet = TRUE)
usethis::use_git_ignore(ignores)
}
| 619 | mit |
devtools | cran-devtools-945c660/R/utils.R | compact <- function(x) {
is_empty <- vapply(x, function(x) length(x) == 0, logical(1))
x[!is_empty]
}
"%||%" <- function(a, b) if (!is.null(a)) a else b
"%:::%" <- function(p, f) {
get(f, envir = asNamespace(p))
}
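`%:::%` is devtools' shorthand for reaching into a package namespace by name, including unexported objects (the equivalent of `pkg:::fun`). For example:

```r
# Fetch a function from a namespace by name; this also works for
# unexported objects, which `::` cannot reach.
"%:::%" <- function(p, f) get(f, envir = asNamespace(p))

head_fun <- "utils" %:::% "head"
identical(head_fun, utils::head) # TRUE
```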
is_windows <- isTRUE(.Platform$OS.type == "windows")
sort_ci <- function(x) {
withr::with_collate("C", x[order(tolower(x), x)])
}
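`sort_ci()` pins the collation to "C" so that ordering is reproducible across locales, sorting case-insensitively first and breaking ties with the case-sensitive value. For example (assuming withr is available):

```r
# Case-insensitive sort under a fixed "C" collation, as in sort_ci();
# ties between case variants are broken by the case-sensitive value
# (uppercase letters sort first in the C locale).
x <- c("banana", "Apple", "apple", "Banana")
withr::with_collate("C", x[order(tolower(x), x)])
# "Apple" "apple" "Banana" "banana"
```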
is_loaded <- function(pkg = ".") {
pkg <- as.package(pkg)
pkg$package %in% loadedNamespaces()
}
is_attached <- function(pkg = ".") {
pkg <- as.package(pkg)
!is.null(pkgload::pkg_env(pkg$package))
}
vcapply <- function(x, FUN, ...) {
vapply(x, FUN, FUN.VALUE = character(1), ...)
}
release_bullets <- function() {
c(
'`usethis::use_latest_dependencies(TRUE, "CRAN")`',
NULL
)
}
is_testing <- function() {
identical(Sys.getenv("TESTTHAT"), "true")
}
is_rstudio_running <- function() {
!is_testing() && rstudioapi::isAvailable()
}
| 917 | mit |
devtools | cran-devtools-945c660/R/vignette-r.R | copy_vignettes <- function(pkg, keep_md) {
pkg <- as.package(pkg)
usethis_use_directory(pkg, "doc", ignore = TRUE)
usethis_use_git_ignore(pkg, "/doc/")
doc_dir <- path(pkg$path, "doc")
vignettes <- tools::pkgVignettes(dir = pkg$path, output = TRUE, source = TRUE)
if (length(vignettes$docs) == 0) {
return(invisible())
}
md_outputs <- character()
if (isTRUE(keep_md)) {
md_outputs <- dir_ls(path = vignettes$dir, regexp = "[.]md$")
}
out_mv <- unique(c(
md_outputs,
vignettes$outputs,
unlist(vignettes$sources, use.names = FALSE)
))
out_cp <- vignettes$docs
cli::cli_inform(c(i = "Moving {.file {path_file(out_mv)}} to {.path doc/}"))
file_copy(out_mv, doc_dir, overwrite = TRUE)
file_delete(out_mv)
cli::cli_inform(c(i = "Copying {.file {path_file(out_cp)}} to {.path doc/}"))
file_copy(out_cp, doc_dir, overwrite = TRUE)
# Copy extra files, if needed
extra_files <- find_vignette_extras(pkg)
if (length(extra_files) == 0) {
return(invisible())
}
cli::cli_inform(c(i = "Copying extra files {.file {path_file(extra_files)}} to {.path doc/}"))
file_copy(extra_files, doc_dir)
invisible()
}
find_vignette_extras <- function(pkg = ".") {
pkg <- as.package(pkg)
vig_path <- path(pkg$path, "vignettes")
extras_file <- path(vig_path, ".install_extras")
if (!file_exists(extras_file)) {
return(character())
}
extras <- readLines(extras_file, warn = FALSE)
if (length(extras) == 0) {
return(character())
}
all_files <- path_rel(dir_ls(vig_path, all = TRUE), vig_path)
re <- paste0(extras, collapse = "|")
files <- grep(re, all_files, perl = TRUE, ignore.case = TRUE, value = TRUE)
path_real(path(vig_path, files))
}
| 1,731 | mit |
devtools | cran-devtools-945c660/R/vignettes.R | #' Build package vignettes.
#'
#' Builds package vignettes using the same algorithm that `R CMD build`
#' does. This means including non-Sweave vignettes, using makefiles (if
#' present), and copying over extra files. The files are copied into the 'doc'
#' directory and a vignette index is created in 'Meta/vignette.rds', as they
#' would be in a built package. 'doc' and 'Meta' are added to
#' `.Rbuildignore`, so will not be included in the built package. These
#' files can be checked into version control, so they can be viewed with
#' `browseVignettes()` and `vignette()` if the package has been
#' loaded with `load_all()` without needing to re-build them locally.
#'
#' @template devtools
#' @param quiet If `TRUE`, suppresses most output. Set to `FALSE`
#' if you need to debug.
#' @param install If `TRUE`, install the package before building
#' vignettes.
#' @param keep_md If `TRUE`, move md intermediates as well as rendered
#' outputs. Most useful when using the `keep_md` YAML option for Rmarkdown
#' outputs. See
#' <https://bookdown.org/yihui/rmarkdown/html-document.html#keeping-markdown>.
#' @inheritParams tools::buildVignettes
#' @inheritParams remotes::install_deps
#' @importFrom stats update
#' @keywords programming
#' @seealso [clean_vignettes()] to remove the PDFs in
#' \file{doc} created from vignettes.
#' @export
build_vignettes <- function(pkg = ".",
dependencies = "VignetteBuilder",
clean = TRUE,
upgrade = "never",
quiet = FALSE,
install = TRUE,
keep_md = TRUE) {
pkg <- as.package(pkg)
save_all()
vigns <- tools::pkgVignettes(dir = pkg$path)
if (length(vigns$docs) == 0) return()
deps <- remotes::dev_package_deps(pkg$path, dependencies)
update(deps, upgrade = upgrade)
if (isTRUE(install)) {
local_install(pkg, quiet = TRUE)
}
cli::cli_inform(c(i = "Building vignettes for {.pkg {pkg$package}}"))
callr::r(
function(...) tools::buildVignettes(...),
args = list(
dir = pkg$path,
clean = clean,
tangle = TRUE,
quiet = quiet
),
show = !quiet,
spinner = FALSE
)
# We need to re-run pkgVignettes now that they are built to get the output
# files as well
cli::cli_inform(c(i = "Copying vignettes"))
vigns <- tools::pkgVignettes(dir = pkg$path, source = TRUE, output = TRUE)
copy_vignettes(pkg, keep_md)
create_vignette_index(pkg, vigns)
invisible(TRUE)
}
create_vignette_index <- function(pkg, vigns) {
cli::cli_inform(c(i = "Building vignette index"))
usethis_use_directory(pkg, "Meta", ignore = TRUE)
usethis_use_git_ignore(pkg, "/Meta/")
vignette_index <- ("tools" %:::% ".build_vignette_index")(vigns)
vignette_index_path <- path(pkg$path, "Meta", "vignette.rds")
saveRDS(vignette_index, vignette_index_path, version = 2L)
}
#' Clean built vignettes.
#'
#' This uses a fairly rudimentary algorithm where any files in \file{doc}
#' with a name that exists in \file{vignettes} are removed.
#'
#' @template devtools
#' @export
clean_vignettes <- function(pkg = ".") {
pkg <- as.package(pkg)
vigns <- tools::pkgVignettes(dir = pkg$path)
if (path_file(vigns$dir) != "vignettes") return()
cli::cli_inform(c(i = "Cleaning built vignettes and index from {.pkg {pkg$package}}"))
doc_path <- path(pkg$path, "doc")
vig_candidates <- if (dir_exists(doc_path)) dir_ls(doc_path) else character()
vig_rm <- vig_candidates[file_name(vig_candidates) %in% file_name(vigns$docs)]
extra_candidates <- path(doc_path, path_file(find_vignette_extras(pkg)))
extra_rm <- extra_candidates[file_exists(extra_candidates)]
meta_path <- path(pkg$path, "Meta")
vig_index_path <- path(meta_path, "vignette.rds")
vig_index_rm <- if (file_exists(vig_index_path)) vig_index_path
to_remove <- c(vig_rm, extra_rm, vig_index_rm)
if (length(to_remove) > 0) {
cli::cli_inform(c(x = "Removing {.file {path_file(to_remove)}}"))
file_delete(to_remove)
}
lapply(c(doc_path, meta_path), dir_delete_if_empty)
invisible(TRUE)
}
dir_delete_if_empty <- function(x) {
if (dir_exists(x) && rlang::is_empty(dir_ls(x))) {
dir_delete(x)
cli::cli_inform(c(x = "Removing {.file {path_file(x)}}"))
}
}
file_name <- function(x) {
if (length(x) == 0) return(NULL)
path_ext_remove(path_file(x))
}
| 4,488 | mit |
devtools | cran-devtools-945c660/R/wd.R | #' Set working directory.
#'
#' @template devtools
#' @param path path within package. Leave empty to change working directory
#' to package directory.
#' @export
wd <- function(pkg = ".", path = "") {
pkg <- as.package(pkg)
path <- path(pkg$path, path)
if (!file_exists(path)) {
cli::cli_abort("{.path {path}} does not exist")
}
cli::cli_inform(c(i = "Changing working directory to {.path {path}}"))
setwd(path)
}
| 434 | mit |
devtools | cran-devtools-945c660/R/zzz.R | #' @importFrom utils available.packages contrib.url install.packages
#' installed.packages modifyList packageDescription
#' packageVersion remove.packages
#' @importFrom cli cat_rule cat_bullet
#' @import fs
NULL
#' Deprecated Functions
#'
#' These functions are Deprecated in this release of devtools, they will be
#' marked as Defunct and removed in a future version.
#' @name devtools-deprecated
#' @keywords internal
NULL
devtools_default_options <- list(
devtools.path = "~/R-dev",
devtools.install.args = "",
devtools.ellipsis_action = rlang::warn
)
.onLoad <- function(libname, pkgname) {
op <- options()
toset <- !(names(devtools_default_options) %in% names(op))
if (any(toset)) options(devtools_default_options[toset])
invisible()
}
| 764 | mit |
devtools | cran-devtools-945c660/tests/spelling.R | if(requireNamespace('spelling', quietly = TRUE))
spelling::spell_check_test(vignettes = TRUE, error = FALSE,
skip_on_cran = TRUE)
| 161 | mit |
devtools | cran-devtools-945c660/tests/testthat.R | library(testthat)
library(devtools)
test_check("devtools")
| 60 | mit |
devtools | cran-devtools-945c660/tests/testthat/helper.R | # This is a VERY trimmed down version of create_local_thing from usethis
local_package_create <- function(envir = parent.frame()) {
dir <- withr::local_tempdir(.local_envir = envir)
usethis::ui_silence({
create_package(dir, rstudio = FALSE, open = FALSE, check_name = FALSE)
})
dir
}
local_package_copy <- function(path, env = parent.frame()) {
temp_path <- withr::local_tempdir(.local_envir = env)
dir_copy(path, temp_path, overwrite = TRUE)
temp_path
}
| 477 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-active.R | test_that("find_active_file() gives useful error if no RStudio", {
expect_snapshot(find_active_file(), error = TRUE)
})
test_that("fails if can't find tests", {
expect_snapshot(error = TRUE, {
find_test_file("R/foo.blah")
find_test_file("R/foo.R")
})
})
test_that("can determine file type", {
expect_equal(test_file_type("R/foo.R"), "R")
expect_equal(test_file_type("R/foo.c"), NA_character_)
expect_equal(test_file_type("src/foo.c"), "src")
expect_equal(test_file_type("src/foo.R"), NA_character_)
expect_equal(test_file_type("tests/testthat/test-foo.R"), "test")
expect_equal(test_file_type("tests/testthat/test-foo.c"), NA_character_)
expect_equal(test_file_type("tests/testthat/foo.R"), NA_character_)
expect_equal(test_file_type("DESCRIPTION"), NA_character_)
})
| 803 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-build-readme.R | test_that("can build README in root directory", {
skip_on_cran()
pkg <- local_package_create()
suppressMessages(usethis::with_project(pkg, use_readme_rmd()))
suppressMessages(build_readme(pkg))
expect_true(file_exists(path(pkg, "README.md")))
expect_false(file_exists(path(pkg, "README.html")))
})
test_that("can build README in inst/", {
skip_on_cran()
pkg <- local_package_create()
suppressMessages(usethis::with_project(pkg, use_readme_rmd()))
dir_create(pkg, "inst")
file_move(
path(pkg, "README.Rmd"),
path(pkg, "inst", "README.Rmd")
)
suppressMessages(build_readme(pkg))
expect_true(file_exists(path(pkg, "inst", "README.md")))
expect_false(file_exists(path(pkg, "README.Rmd")))
expect_false(file_exists(path(pkg, "README.md")))
expect_false(file_exists(path(pkg, "inst", "README.html")))
})
test_that("useful errors if too few or too many", {
pkg <- local_package_create()
expect_snapshot(build_readme(pkg), error = TRUE)
suppressMessages(usethis::with_project(pkg, use_readme_rmd()))
dir_create(pkg, "inst")
file_copy(path(pkg, "README.Rmd"), path(pkg, "inst", "README.Rmd"))
expect_snapshot(build_readme(pkg), error = TRUE)
})
| 1,197 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-build-site.R | test_that("Package pkgdown site can be built ", {
# This test requires internet
skip_if_offline()
skip_on_cran()
destination <- path(tempdir(), "testPkgdown", "docs")
build_output <- capture.output({
build_site(
path = "testPkgdown",
override = list(destination = destination)
)
}, type = c("output"))
build_output <- paste(build_output, collapse = "\n")
expect_true(file_exists(path(destination, "index.html")),
info = build_output,
label = "created site index"
)
expect_true(file_exists(path(destination, "reference", "index.html")),
info = build_output,
label = "created reference index"
)
expect_true(file_exists(path(destination, "articles", "index.html")),
info = build_output,
label = "created articles index"
)
expect_true(file_exists(path(destination, "articles", "test.html")),
info = build_output,
label = "created articles index"
)
})
| 933 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-check-doc.R | test_that("check_man works", {
# tools:::.check_Rd_xrefs which is called by `check_man()` assumes the base
# and recommended packages will all be in the library path, which is not the
# case during R CMD check, so we only run these tests interactively
skip_if_not(interactive())
pkg <- local_package_create()
dir.create(file.path(pkg, "man"))
writeLines(c("
\\name{foo}
\\title{Foo bar}
\\usage{
foo(x)
}
\\arguments{\\item{foo}{}}
"), file.path(pkg, "man", "foo.Rd"))
expect_output(
check_man(pkg),
"Undocumented arguments"
)
})
| 551 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-check-win.R | test_that("change_maintainer_email checks fields", {
path <- withr::local_tempfile()
desc <- desc::desc(text = "")
desc$write(path)
expect_snapshot(change_maintainer_email(path, "x@example.com"), error = TRUE)
desc <- desc::desc(text = c(
"Authors@R: person('x', 'y')",
"Maintainer: foo <foo@example.com>"
))
desc$write(path)
expect_snapshot(change_maintainer_email(path, "x@example.com"), error = TRUE)
})
| 433 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-check.R | test_that("can determine when to document", {
expect_false(can_document(list()))
# TODO: switch to expect_snapshot()
suppressMessages(expect_message(
expect_false(can_document(list(roxygennote = "15.0.00"))),
"doesn't match required"
))
expect_true(can_document(list(roxygennote = packageVersion("roxygen2"))))
})
test_that("fail instead of sending an email to wrong recipient", {
# The testTest package has both Authors@R and a Maintainer field, which causes
# problems in change_maintainer_email(). The function checks that the provided
# email matches the one in the Maintainer field instead of sending the report
# to the wrong recipient.
expect_error(check_win_release(path("testTest"), email = "foo@bar.com"))
})
| 739 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-install.R | library(mockery)
local({
pkg <- fs::path_real(local_package_create())
path2char <- function(x) {
if (inherits(x, "fs_path")) {
as.character(x)
} else {
x
}
}
expect_passes_args <- function(fn, stub, input_args = list(), expected_args) {
mck <- mockery::mock(NULL)
mockery::stub(fn, stub, mck)
capture.output(suppressMessages(do.call(fn, input_args)))
mockery::expect_called(mck, 1)
mock_args <- mockery::mock_args(mck)[[1]]
mock_args <- lapply(mock_args, path2char)
expect_equal(mock_args, expected_args)
}
custom_args <- list(
dependencies = "dep",
repos = "repo",
type = "type",
upgrade = "upgrade",
quiet = "quiet",
build = "build",
build_opts = "build_opts"
)
dep_defaults <- list(
dependencies = NA,
repos = getOption("repos"),
type = getOption("pkgType"),
upgrade = c("default", "ask", "always", "never"),
quiet = FALSE,
build = TRUE,
build_opts = c("--no-resave-data", "--no-manual", " --no-build-vignettes")
)
dev_dep_defaults <- list(
dependencies = TRUE,
repos = getOption("repos"),
type = getOption("pkgType"),
upgrade = c("default", "ask", "always", "never"),
quiet = FALSE,
build = TRUE,
build_opts = c("--no-resave-data", "--no-manual", " --no-build-vignettes")
)
extra <- list(foo = "foo", bar = "bar")
test_that("install_deps passes default args to remotes::install_deps", {
expect_passes_args(
install_deps,
"remotes::install_deps",
list(pkg),
c(pkg, dep_defaults)
)
})
test_that("install_deps passes custom args to remotes::install_deps", {
expect_passes_args(
install_deps,
"remotes::install_deps",
c(pkg, custom_args),
c(pkg, custom_args)
)
})
test_that("install_deps passes ellipsis args to remotes::install_deps", {
expect_passes_args(
install_deps,
"remotes::install_deps",
c(pkg, extra),
c(pkg, dep_defaults, extra)
)
})
test_that("install_dev_deps passes default args to remotes::install_deps", {
expect_passes_args(
install_dev_deps,
"remotes::install_deps",
list(pkg),
c(pkg, dev_dep_defaults)
)
})
test_that("install_dev_deps passes custom args to remotes::install_deps", {
expect_passes_args(
install_dev_deps,
"remotes::install_deps",
c(pkg, custom_args),
c(pkg, custom_args)
)
})
test_that("install_dev_deps passes ellipsis args to remotes::install_deps", {
expect_passes_args(
install_dev_deps,
"remotes::install_deps",
c(pkg, extra),
c(pkg, dev_dep_defaults, extra)
)
})
})
test_that("vignettes built on install", {
skip_on_cran()
if (!pkgbuild::has_latex()) {
skip("pdflatex not available")
}
pkg <- local_package_copy(test_path("testVignettesBuilt"))
withr::local_temp_libpaths()
install(pkg, reload = FALSE, quiet = TRUE, build_vignettes = TRUE)
vigs <- vignette(package = "testVignettesBuilt")$results
expect_equal(nrow(vigs), 1)
expect_equal(vigs[3], "new")
})
| 2,933 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-package.R |
test_that("package_file() gives useful errors", {
expect_snapshot(error = TRUE, {
package_file(path = 1)
package_file(path = "doesntexist")
package_file(path = "/")
})
})
test_that("create argument is deprecated", {
path <- local_package_create()
expect_snapshot(x <- as.package(path, create = TRUE))
})
| 325 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-reload.R |
test_that("reload works", {
withr::local_temp_libpaths()
pkg <- as.package(test_path("testTest"))
pkg_name <- pkg$package
install(pkg, quiet = TRUE)
on.exit(unload(pkg$package), add = TRUE)
expect_false(is_loaded(pkg))
# Do nothing if the package is not loaded
expect_error(reload(pkg, quiet = TRUE), NA)
expect_false(is_loaded(pkg))
# Reload if loaded
requireNamespace(pkg_name, quietly = TRUE)
expect_true(is_loaded(pkg))
reload(pkg, quiet = TRUE)
expect_true(is_loaded(pkg))
# Re-attach if attached
unload(pkg$package, quiet = TRUE)
library(pkg_name, character.only = TRUE, quietly = TRUE)
expect_true(is_loaded(pkg))
expect_true(is_attached(pkg))
reload(pkg, quiet = TRUE)
expect_true(is_loaded(pkg))
expect_true(is_attached(pkg))
})
| 788 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-run-examples.R |
test_that("Can run an example", {
pkg <- "testHelp"
expect_output(
suppressMessages(run_examples(pkg = pkg, document = FALSE)),
"You called foofoo.",
fixed = TRUE
)
})
| 186 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-run-source.R |
test_that("gist containing single file works unambiguously", {
skip_if_offline()
skip_on_cran()
skip_on_ci()
a <- 10
source_gist(
"a65ddd06db40213f1a921237c55defbe",
sha1 = "f176f5e1fe05b69b1ef799fdd1e4bac6341aff51",
local = environment(),
quiet = TRUE
)
expect_equal(a, 1)
})
test_that("gist with multiple files uses first with warning", {
skip_if_offline()
skip_on_cran()
skip_on_ci()
a <- 10
expect_snapshot(
source_gist(
"605a984e764f9ed358556b4ce48cbd08",
sha1 = "f176f5e1fe0",
local = environment()
)
)
expect_equal(a, 1)
})
test_that("errors with bad id", {
expect_snapshot(source_gist("xxxx"), error = TRUE)
})
test_that("can specify filename", {
skip_if_offline()
skip_on_cran()
skip_on_ci()
b <- 10
source_gist(
"605a984e764f9ed358556b4ce48cbd08",
filename = "b.r",
sha1 = "8d1c53241c425a9a52700726809b7f2c164bde72",
local = environment(),
quiet = TRUE
)
expect_equal(b, 2)
})
test_that("error if file doesn't exist or no files", {
skip_if_offline()
skip_on_cran()
skip_on_ci()
expect_snapshot(error = TRUE, {
find_gist("605a984e764f9ed358556b4ce48cbd08", 1)
find_gist("605a984e764f9ed358556b4ce48cbd08", "c.r")
find_gist("c535eee2d02e5f47c8e7642811bc327c")
})
})
test_that("check_sha1() checks or reports sha1 as needed", {
path <- withr::local_tempfile()
writeBin("abc\n", path)
expect_snapshot(error = TRUE, {
check_sha1(path, NULL)
check_sha1(path, "f")
check_sha1(path, "ffffff")
})
})
| 1,560 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-sitrep.R |
test_that("check_for_rstudio_updates", {
skip_if_offline()
skip_on_cran()
# the IDE ends up calling this with `os = "mac"` on macOS, but we would send
# "darwin" in that case, so I test with "darwin"
# also mix in some "windows"
# returns nothing if RStudio is not available
expect_null(check_for_rstudio_updates("darwin", "1.0.0", FALSE))
# returns nothing if the version is ahead of the current version
expect_null(check_for_rstudio_updates("windows", "2030.12.0+123", TRUE))
# returns something if ...
local_edition(3)
scrub_current_version <- function(message) {
sub("(?<=^RStudio )[0-9\\.\\+]+", "{VERSION}", message, perl = TRUE)
}
# version is not understood by the service
expect_snapshot(
writeLines(check_for_rstudio_updates("windows", "haha-no-wut", TRUE))
)
# version is behind the current version
# truly ancient
expect_snapshot(
writeLines(check_for_rstudio_updates("darwin", "0.0.1", TRUE)),
transform = scrub_current_version
)
# Juliet Rose, does not have long_version, last before numbering changed
expect_snapshot(
writeLines(check_for_rstudio_updates("windows", "1.4.1717", TRUE)),
transform = scrub_current_version
)
# new scheme, introduced 2021-08
# YYYY.MM.<patch>[-(daily|preview)]+<build number>[.pro<pro suffix>]
# YYYY.MM is the expected date of release for dailies and previews
# an out-of-date preview
expect_snapshot(
writeLines(check_for_rstudio_updates("darwin", "2021.09.1+372", TRUE)),
transform = scrub_current_version
)
# an out-of-date daily
expect_snapshot(
writeLines(check_for_rstudio_updates("windows", "2021.09.0-daily+328", TRUE)),
transform = scrub_current_version
)
})
| 1,721 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-test.R |
test_test <- function(...) {
suppressMessages(test(..., reporter = "silent"))
}
test_test_active_file <- function(...) {
suppressMessages(test_active_file(..., reporter = "silent"))
}
test_that("Package can be tested with testthat not on search path", {
pkg1 <- test_path("testTest")
pkg2 <- test_path("testTestWithDepends")
testthat_pos <- which(search() == "package:testthat")
if (length(testthat_pos) > 0) {
testthat_env <- detach(pos = testthat_pos)
on.exit(attach(testthat_env, testthat_pos), add = TRUE)
}
test_test(pkg1)
expect_true(TRUE)
test_test(pkg2)
expect_true(TRUE)
})
test_that("Filtering works with devtools::test", {
out <- test_test(test_path("testTest"), filter = "dummy")
expect_equal(length(out), 1)
})
test_that("devtools::test_active_file works", {
out <- test_test_active_file(test_path("testTest/tests/testthat/test-dummy.R"))
expect_equal(length(out), 1)
})
test_that("TESTTHAT_PKG environment variable is set", {
withr::local_envvar("TESTTHAT_PKG" = "incorrect")
test_test(
test_path("testTest"),
filter = "envvar",
stop_on_failure = TRUE
)
test_active_file(
test_path("testTest/tests/testthat/test-envvar.R"),
stop_on_failure = TRUE
)
expect_true(TRUE)
})
test_that("stop_on_failure defaults to FALSE", {
expect_error(
test_test(test_path("testTestWithFailure")),
NA
)
expect_error(
test_test(test_path("testTestWithFailure"), stop_on_failure = TRUE),
"Test failures"
)
})
| 1,504 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-uninstall.R |
test_that("uninstall() unloads and removes from library", {
withr::local_temp_libpaths()
# Install package
install(test_path("testHelp"), quiet = TRUE)
expect_true(require(testHelp, quietly = TRUE))
expect_true("testHelp" %in% loaded_packages()$package)
# Uninstall package
uninstall(test_path("testHelp"), quiet = TRUE)
expect_false("testHelp" %in% loaded_packages()$package)
suppressWarnings(expect_false(require(testHelp, quietly = TRUE)))
})
| 466 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-utils.R |
test_that("case-insensitive sort order", {
expect_equal(sort_ci(rev(letters)), letters)
expect_equal(sort_ci(rev(LETTERS)), LETTERS)
expect_equal(sort_ci(c(letters[1:3], LETTERS[1:3])), c("A", "a", "B", "b", "C", "c"))
})
| 228 | mit |
devtools | cran-devtools-945c660/tests/testthat/test-vignettes.R |
test_that("Sweave vignettes copied into doc", {
if (!pkgbuild::has_latex()) {
skip("pdflatex not available")
}
pkg <- local_package_copy(test_path("testVignettes"))
suppressMessages(build_vignettes(pkg, quiet = TRUE))
expect_setequal(
path_file(dir_ls(path(pkg, "doc"))),
c("new.pdf", "new.R", "new.Rnw")
)
})
test_that("Built files are updated", {
# This test is time dependent and sometimes fails on CRAN because the systems are under heavy load.
skip_on_cran()
pkg <- local_package_copy(test_path("testMarkdownVignettes"))
suppressMessages(build_vignettes(pkg, quiet = TRUE))
output <- dir_ls(path(pkg, "doc"), regexp = "new")
first <- file_info(output)$modification_time
Sys.sleep(.01)
suppressMessages(build_vignettes(pkg, quiet = TRUE))
second <- file_info(output)$modification_time
expect_true(all(second > first))
})
test_that("Rmarkdown vignettes copied into doc", {
pkg <- local_package_copy(test_path("testMarkdownVignettes"))
doc <- path(pkg, "doc")
suppressMessages(build_vignettes(pkg, quiet = TRUE))
expect_setequal(path_file(dir_ls(doc)), c("test.html", "test.R", "test.Rmd"))
})
test_that("extra files copied and removed", {
pkg <- local_package_copy(test_path("testMarkdownVignettes"))
writeLines("a <- 1", path(pkg, "vignettes", "a.R"))
extras_path <- path(pkg, "vignettes", ".install_extras")
writeLines("a.R", extras_path)
suppressMessages(build_vignettes(pkg, quiet = TRUE))
expect_true(file_exists(path(pkg, "doc", "a.R")))
suppressMessages(clean_vignettes(pkg))
expect_false(file_exists(path(pkg, "doc", "a.R")))
})
test_that(".gitignore updated when building vignettes", {
pkg <- local_package_copy(test_path("testMarkdownVignettes"))
gitignore <- path(pkg, ".gitignore")
suppressMessages(build_vignettes(pkg, quiet = TRUE))
expect_true(all(c("/Meta/", "/doc/") %in% readLines(gitignore)))
})
| 1,908 | mit |
devtools | cran-devtools-945c660/tests/testthat/testCheckExtrafile/R/a.R |
#' A number.
#' @export
a <- 1
| 31 | mit |
devtools | cran-devtools-945c660/tests/testthat/testError/R/error.R |
f <- function() {
5 * 10
}
stop("This is an error!") # nolint
| 67 | mit |
devtools | cran-devtools-945c660/tests/testthat/testHelp/R/foofoo.R |
#' Test function for help
#'
#' The purpose of this function is to test out \code{help} and \code{?} from
#' devtools.
#'
#' @examples
#' stopifnot(foofoo() == 'You called foofoo.')
#' @export
foofoo <- function() "You called foofoo."
| 235 | mit |
devtools | cran-devtools-945c660/tests/testthat/testMissingNsObject/R/a.R |
a <- 1
| 7 | mit |
devtools | cran-devtools-945c660/tests/testthat/testPkgdown/R/pkgdown-test-test.R |
#' pkgdown_test_test
#'
#' @param x marks the spot
#'
#' @return FALSE
#' @export
#'
pkgdown_test_test <- function(x) {
return(FALSE)
}
| 139 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTest/R/dummy.R | 1 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTest/tests/testthat.R |
library(testthat)
library(testTest)
test_check("testTest")
| 60 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTest/tests/testthat/test-dummy.R |
test_that("multiplication works", {
expect_equal(2 * 2, 4)
})
| 64 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTest/tests/testthat/test-envvar.R |
test_that("TESTTHAT_PKG environment variable is set", {
expect_equal(Sys.getenv("TESTTHAT_PKG"), "testTest")
})
| 114 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithDepends/tests/testthat.R |
library(testthat)
library(testTestWithDepends)
test_check("testTestWithDepends")
| 82 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithDepends/tests/testthat/test-dummy.R |
test_that("multiplication works", {
expect_equal(2 * 2, 4)
})
| 64 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithFailure/R/dummy.R | 1 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithFailure/tests/testthat.R |
library(testthat)
library(testTest)
test_check("testTest")
| 60 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithFailure/tests/testthat/test-fail.R |
test_that("failing test", {
fail("Broken")
})
| 48 | mit |
devtools | cran-devtools-945c660/tests/testthat/testTestWithFailure/tests/testthat/test-warn.R |
test_that("warning from test", {
warning("Beware!") # nolint
})
| 66 | mit |
End of preview.
CRAN packages dataset
R and Rmd source code for CRAN packages.
The dataset has been constructed using the following steps:
- Downloaded the latest version of every package on CRAN (see "Last updated" below). The source code was fetched from the GitHub mirror.
- Identified each package's license from its DESCRIPTION file and classified it into a license_code (see the licenses.csv file).
- Extracted the R and Rmd source files from all packages and joined them with the package licenses.
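The final join step above can be sketched with pandas. The frames below are hypothetical stand-ins, not real dataset rows; only the column names (`package`, `path`, `license`) come from the schema described on this card:

```python
import pandas as pd

# Join extracted source files with per-package licenses (step 3 above).
# Packages and paths here are illustrative placeholders.
files = pd.DataFrame({
    "package": ["pkgA", "pkgA", "pkgB"],
    "path": ["R/a.R", "vignettes/a.Rmd", "R/b.R"],
})
licenses = pd.DataFrame({
    "package": ["pkgA", "pkgB"],
    "license": ["mit", "gpl-3"],
})

# Left join so every extracted file keeps a row even if its package
# has no classified license.
joined = files.merge(licenses, on="package", how="left")
print(joined["license"].tolist())  # ['mit', 'mit', 'gpl-3']
```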
Datasets are provided as parquet files containing the following columns:
FileSystemDataset with 1 Parquet file
package: string
path: string
content: large_string
size: double
license: string
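With that schema, the license column can be used to filter the rows after download. A minimal pandas sketch, using an in-memory stand-in frame (the row values are illustrative, only the column names match the schema above):

```python
import pandas as pd

# Columns match the documented schema; the values are hypothetical.
df = pd.DataFrame({
    "package": ["devtools", "devtools", "examplepkg"],
    "path": ["tests/testthat/test-utils.R", "R/install.R", "R/zzz.R"],
    "content": ["test_that(...)", "install <- function() {}", "a <- 1"],
    "size": [228.0, 512.0, 7.0],
    "license": ["mit", "mit", "gpl-3"],
})

# Keep only permissively licensed files via the license column.
mit_only = df[df["license"] == "mit"]
print(len(mit_only))  # 2
```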
Last updated: Jun 6th 2023
Changelog
- v1: Initial version
- dev: added all CRAN files and a license field that allows filtering by license; also removed some unused columns.
- Downloads last month: 80