Draft
Changes from all commits (90 commits)
bb26660
initial support for InterSystems IRIS
bdeboe Feb 29, 2024
02b73bd
reran roxygen
bdeboe Feb 29, 2024
b2ffcf8
use uppercase IRIS in JDBC URL
bdeboe Jun 24, 2024
758ef2f
Fix CRAN badges
schuemie Aug 23, 2024
2cdd99e
Bump actions/download-artifact from 2 to 4.1.7 in /.github/workflows
dependabot[bot] Sep 3, 2024
46eb774
Merge pull request #291 from OHDSI/dependabot/github_actions/dot-gith…
schuemie Sep 4, 2024
05f1f31
initial support for InterSystems IRIS
bdeboe Feb 29, 2024
0b7c045
reran roxygen
bdeboe Feb 29, 2024
05cc5db
use uppercase IRIS in JDBC URL
bdeboe Jun 24, 2024
2c4a46c
Merge branch 'main' of github.com:intersystems-community/OHDSI-Databa…
bdeboe Sep 16, 2024
6f307d9
Note InterSystems IRIS in reference
bdeboe Sep 16, 2024
4283406
Fix section formatting & rerun roxygen
bdeboe Sep 16, 2024
ea6d7af
Merge branch 'main' into develop
schuemie Oct 3, 2024
e6be45f
Upgrading upload-artifact to v4
schuemie Oct 3, 2024
d543f62
Turning CI on Windows and MacOs back on
schuemie Oct 3, 2024
773df98
Attempting to fix Linux crash by upgrading Postgres driver to latest
schuemie Oct 4, 2024
2e84364
Only running Databricks on Windows
schuemie Oct 4, 2024
b84f2d9
Noting reason for disabling Databricks on Linux. Updating checkout ac…
schuemie Oct 4, 2024
9c000da
Postgres bulk load when using connection string parameter
azimov Oct 4, 2024
928ad9d
Update BulkLoad.R
azimov Oct 4, 2024
69f8dbb
require SqlRender 1.19.0 for new IRIS dialect support
bdeboe Oct 9, 2024
46c896b
simplify testing when test servers are undefined
bdeboe Oct 9, 2024
cc2f38d
simplify retrieving JDBC driver for InterSystems IRIS
bdeboe Oct 9, 2024
6009263
initial IRIS setup
bdeboe Oct 10, 2024
211152d
Merge branch 'develop' of github.com:OHDSI/DatabaseConnector
bdeboe Oct 10, 2024
8a6cdbd
use executeBatch() rather than executeLargeBatch() on InterSystems IRIS
bdeboe Oct 10, 2024
026d0b3
add InterSystems IRIS datatypes to insertTable unit test
bdeboe Oct 10, 2024
e2b5986
drop temp table if exists at start of test
bdeboe Oct 10, 2024
2076842
Merge pull request #292 from OHDSI/postgres-bulk-load-connection-string
schuemie Oct 11, 2024
514d535
Using `INSERT` with multiple values in `insertTable()` for DataBrick…
Dec 16, 2024
8acd773
Code cleanup
Dec 17, 2024
7880ca4
reran roxygen
bdeboe Jan 2, 2025
57a2469
reran pkgdown
bdeboe Jan 2, 2025
2ac302c
update IRIS driver, pull from Maven
bdeboe Jan 2, 2025
a75b9c8
Fixing unit test on platforms using temp emulation
schuemie Jan 8, 2025
cbe58a0
Regenerating documentation. Updating copyright year
schuemie Jan 8, 2025
e619b0d
Fixing covr upload in GA
schuemie Jan 8, 2025
231bc49
Fixing links in documentation
schuemie Jan 9, 2025
1a1ee9a
remove statement.close(). The specific code was not previously there …
IoannaNika Jan 10, 2025
dd4785c
build jar file with zulu-8.jdk
IoannaNika Jan 10, 2025
08fbb2a
try to fix r-cmd-check on github
IoannaNika Jan 10, 2025
fe43921
remove print statements in java code
IoannaNika Jan 10, 2025
1abc92a
Updated BigQuery driver to 1.6.2.
schuemie Jan 14, 2025
212e743
copy_inline: wrong translation on redshift
IoannaNika Jan 14, 2025
f157b86
Fewer inserts on BigQuery to avoid rate limit error during testing
schuemie Jan 14, 2025
08f4d70
Manually fix links broken by roxygen
schuemie Jan 15, 2025
3b73903
Code to fix links
schuemie Jan 15, 2025
7b5548f
Trying different way to fix links to classes in DBI
schuemie Jan 15, 2025
ff12112
Updating CRAN submission file
schuemie Jan 16, 2025
25fd106
Adding bulk load for Spark (DataBricks)
anthonysena Jan 22, 2025
79eb5d9
Merge pull request #301 from OHDSI/databricks-bulk-load
schuemie Jan 22, 2025
2ffbd15
merge from OHDSI/main
bdeboe Jan 27, 2025
cd08658
Merge branch 'OHDSI-main'
bdeboe Jan 27, 2025
d6cc165
remove stray test file
bdeboe Jan 27, 2025
7badb96
use SqlRender 1.19.1
bdeboe Jan 27, 2025
052ea0d
Merge pull request #302 from intersystems-community/main
schuemie Jan 29, 2025
aaf0428
Increasing version number
schuemie Jan 29, 2025
168b5db
Regenerating documentation
schuemie Jan 30, 2025
f0d4790
Updating CRAN submission file
schuemie Jan 31, 2025
0794d6e
Remove direct call to `bit64` S3 method to avoid issues in the future…
schuemie Jan 31, 2025
24ce059
Fixed error when calling `getTableNames()` on a `DuckDB` connection. …
schuemie Jan 31, 2025
4128c24
comment out test for dateAdd and eoMonth after discussion with Adam a…
IoannaNika Feb 14, 2025
c229ed0
commenting out datediff tests (not an R function)
IoannaNika Feb 14, 2025
5784f66
merge ohdsi main branch
ablack3 Feb 14, 2025
8eb9cc4
Merge branch 'develop' of github.com:ohdsi/DatabaseConnector into dbp…
ablack3 Feb 14, 2025
e11d4dc
Merge branch 'dbplyr2-inika' of github.com:darwin-eu-dev/DatabaseConn…
ablack3 Feb 14, 2025
b46f60a
oracle: issue in translation with head() and copy_inline()
IoannaNika Feb 14, 2025
6658d54
disable translation
IoannaNika Feb 17, 2025
a37ca81
add sql_query_select to fix limit not being translated correct
IoannaNika Feb 17, 2025
443e4c8
update insertTable tests
IoannaNika Feb 17, 2025
1edf8f8
Merge branch 'dbplyr2-inika' of github.com:darwin-eu-dev/DatabaseConn…
IoannaNika Feb 17, 2025
d1c712b
update documentation and namespace
IoannaNika Feb 17, 2025
610517a
change default behaviour for switch
IoannaNika Feb 17, 2025
97309a9
update tests: after adding sql_query_select in the backend head() in …
IoannaNika Feb 17, 2025
768ce34
enable translation to account for temp emulation
IoannaNika Feb 19, 2025
c920c8e
skip tests on database systems for which translation is not working c…
IoannaNika Feb 19, 2025
34b5134
exempt snowflake from copy_to and copy_inline tests
IoannaNika Feb 19, 2025
47b0f85
specify bigrquery as dependency
IoannaNika Feb 19, 2025
b43a815
add translation option for iris
IoannaNika Feb 20, 2025
33d6592
skip dplyr tests on iris
ablack3 Feb 20, 2025
c155ab0
fix if statement for iris tests
IoannaNika Feb 21, 2025
9c92066
update checksum
IoannaNika Feb 21, 2025
c91df95
update rd files
IoannaNika Feb 21, 2025
18590c4
update documentation
IoannaNika Feb 21, 2025
1eefe94
update docs
ablack3 Feb 21, 2025
5368140
fix documentation
IoannaNika Feb 21, 2025
9f14880
exempt spark and bigquery from copy_to and copy_inline
IoannaNika Feb 21, 2025
e0e4a2b
exempt bigquery from copy_to()
IoannaNika Feb 21, 2025
8f53df1
leave code for deprecated dbms in
ablack3 Mar 3, 2025
3fe4549
remove comment
ablack3 Mar 3, 2025
35 changes: 26 additions & 9 deletions .github/workflows/R_CMD_check_Hades.yaml
@@ -20,6 +20,8 @@ jobs:
fail-fast: false
matrix:
config:
- {os: windows-latest, r: 'release'}
- {os: macOS-latest, r: 'release'}
- {os: ubuntu-20.04, r: 'release', rspm: "https://packagemanager.rstudio.com/cran/__linux__/focal/latest"}

env:
@@ -63,7 +65,7 @@ jobs:
CDM_BIG_QUERY_OHDSI_SCHEMA: ${{ secrets.CDM_BIG_QUERY_OHDSI_SCHEMA }}

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Setup Java
if: runner.os != 'Linux'
@@ -103,22 +105,37 @@ jobs:

- name: Upload source package
if: success() && runner.os == 'macOS' && github.event_name != 'pull_request' && github.ref == 'refs/heads/main'
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: package_tarball
path: check/*.tar.gz

- name: Install covr
if: runner.os == 'macOS'
if: runner.os == 'Linux'
run: |
install.packages("covr")
remotes::install_cran("covr")
remotes::install_cran("xml2")
shell: Rscript {0}

- name: Test coverage
if: runner.os == 'macOS'
run: covr::codecov()
if: runner.os == 'Linux'
run: |
cov <- covr::package_coverage(
quiet = FALSE,
clean = FALSE,
install_path = file.path(normalizePath(Sys.getenv("RUNNER_TEMP"), winslash = "/"), "package")
)
covr::to_cobertura(cov)
shell: Rscript {0}

- uses: codecov/codecov-action@v4
if: runner.os == 'Linux'
with:
file: ./cobertura.xml
plugin: noop
disable_search: true
token: ${{ secrets.CODECOV_TOKEN }}

Release:
needs: R-CMD-Check

@@ -131,7 +148,7 @@ jobs:

steps:

- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 0

@@ -173,7 +190,7 @@ jobs:

- name: Download package tarball
if: ${{ env.new_version != '' }}
uses: actions/download-artifact@v2
uses: actions/download-artifact@v4.1.7
with:
name: package_tarball

2 changes: 1 addition & 1 deletion .github/workflows/R_CMD_check_main_weekly.yaml
@@ -52,7 +52,7 @@ jobs:
CDM5_SPARK_OHDSI_SCHEMA: ${{ secrets.CDM5_SPARK_OHDSI_SCHEMA }}

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4

- name: Setup Java
uses: actions/setup-java@v4
4 changes: 3 additions & 1 deletion .gitignore
@@ -5,4 +5,6 @@ R/LocalEnvironment.R
/bin/
/DatabaseConnector.iml
.idea/
errorReportSql.txt
errorReportSql.txt
/work/
.settings/org.eclipse.core.resources.prefs
6 changes: 3 additions & 3 deletions CRAN-SUBMISSION
@@ -1,3 +1,3 @@
Version: 6.3.2
Date: 2023-12-11 13:54:30 UTC
SHA: dd511ec8b23927ffb61a17bedd9ee8bb81cbe476
Version: 6.4.0
Date: 2025-01-30 11:09:57 UTC
SHA: 168b5dbd79128199da65551f84b5d128c1f08d67
15 changes: 9 additions & 6 deletions DESCRIPTION
@@ -1,8 +1,8 @@
Package: DatabaseConnector
Type: Package
Title: Connecting to Various Database Platforms
Version: 6.3.3.9000
Date: 2024-06-13
Version: 6.4.0
Date: 2025-01-30
Authors@R: c(
person("Martijn", "Schuemie", email = "schuemie@ohdsi.org", role = c("aut", "cre")),
person("Marc", "Suchard", role = c("aut")),
Expand All @@ -13,14 +13,15 @@ Authors@R: c(
person("Amazon Inc.", role = c("cph"), comment = "RedShift JDBC driver")
)
Description: An R 'DataBase Interface' ('DBI') compatible interface to various database platforms ('PostgreSQL', 'Oracle', 'Microsoft SQL Server',
'Amazon Redshift', 'Microsoft Parallel Database Warehouse', 'IBM Netezza', 'Apache Impala', 'Google BigQuery', 'Snowflake', 'Spark', and 'SQLite'). Also includes support for
fetching data as 'Andromeda' objects. Uses either 'Java Database Connectivity' ('JDBC') or other 'DBI' drivers to connect to databases.
'Amazon Redshift', 'Microsoft Parallel Database Warehouse', 'IBM Netezza', 'Apache Impala', 'Google BigQuery', 'Snowflake', 'Spark', 'SQLite',
and 'InterSystems IRIS'). Also includes support for fetching data as 'Andromeda' objects. Uses either 'Java Database Connectivity' ('JDBC') or
other 'DBI' drivers to connect to databases.
SystemRequirements: Java (>= 8)
Depends:
R (>= 4.0.0)
Imports:
rJava,
SqlRender (>= 1.16.0),
SqlRender (>= 1.19.1),
methods,
stringr,
readr,
Expand All @@ -47,8 +48,10 @@ Suggests:
RPostgres,
odbc,
duckdb,
bigrquery,
pool,
ParallelLogger
ParallelLogger,
AzureStor
License: Apache License
VignetteBuilder: knitr
URL: https://ohdsi.github.io/DatabaseConnector/, https://github.com/OHDSI/DatabaseConnector
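The DESCRIPTION change above raises the SqlRender floor to 1.19.1, which is the release that carries the InterSystems IRIS dialect. A quick, hedged pre-install check one might run before using this branch (nothing in this snippet comes from the PR itself):

```r
# Make sure the installed SqlRender is new enough for the "iris" dialect.
if (!requireNamespace("SqlRender", quietly = TRUE) ||
    packageVersion("SqlRender") < "1.19.1") {
  install.packages("SqlRender")
}
```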
12 changes: 12 additions & 0 deletions NAMESPACE
@@ -1,10 +1,20 @@
# Generated by roxygen2: do not edit by hand

S3method(compileReconnectCode,DatabaseConnectorDbiConnection)
S3method(compileReconnectCode,default)
S3method(dbplyr_edition,DatabaseConnectorConnection)
S3method(disconnect,DatabaseConnectorDbiConnection)
S3method(disconnect,default)
S3method(getCatalogs,DatabaseConnectorDbiConnection)
S3method(getCatalogs,default)
S3method(getSchemaNames,DatabaseConnectorDbiConnection)
S3method(getSchemaNames,default)
S3method(getServer,DatabaseConnectorDbiConnection)
S3method(getServer,default)
S3method(insertTable,DatabaseConnectorDbiConnection)
S3method(insertTable,default)
S3method(listDatabaseConnectorColumns,DatabaseConnectorDbiConnection)
S3method(listDatabaseConnectorColumns,default)
S3method(lowLevelExecuteSql,DatabaseConnectorDbiConnection)
S3method(lowLevelExecuteSql,default)
S3method(lowLevelQuerySql,DatabaseConnectorDbiConnection)
@@ -13,6 +23,7 @@ S3method(lowLevelQuerySqlToAndromeda,DatabaseConnectorDbiConnection)
S3method(lowLevelQuerySqlToAndromeda,default)
S3method(renderTranslateQueryApplyBatched,DatabaseConnectorDbiConnection)
S3method(renderTranslateQueryApplyBatched,default)
S3method(sql_query_select,DatabaseConnectorJdbcConnection)
S3method(sql_translation,DatabaseConnectorJdbcConnection)
export(DatabaseConnectorDriver)
export(assertTempEmulationSchemaSet)
@@ -77,6 +88,7 @@ import(methods)
import(rJava)
importFrom(bit64,integer64)
importFrom(dbplyr,dbplyr_edition)
importFrom(dbplyr,sql_query_select)
importFrom(dbplyr,sql_translation)
importFrom(rlang,abort)
importFrom(rlang,inform)
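The newly exported `sql_query_select` and `dbplyr_edition` methods above hook DatabaseConnector JDBC connections into dbplyr so that verbs such as `head()` render the platform-correct row-limiting SQL. A minimal usage sketch (connection details, schema, and table names are placeholders, not taken from this PR):

```r
library(DatabaseConnector)
library(dplyr)

# Placeholder connection details; substitute your own server and credentials.
connection <- connect(createConnectionDetails(
  dbms = "postgresql",
  server = "localhost/ohdsi",
  user = "postgres",
  password = "secret"
))

# head(10) is translated through the sql_query_select method registered above,
# so the row limit ends up in the dialect-appropriate clause.
person <- tbl(connection, dbplyr::in_schema("cdm", "person"))
person %>%
  filter(year_of_birth > 1980) %>%
  head(10) %>%
  collect()

disconnect(connection)
```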
23 changes: 23 additions & 0 deletions NEWS.md
@@ -1,3 +1,22 @@
DatabaseConnector 6.4.1
=======================

Bugfixes:

- Remove direct call to `bit64` S3 method to avoid issues in the future.

- Fixed error when calling `getTableNames()` on a `DuckDB` connection.


DatabaseConnector 6.4.0
=======================

Changes:

- Adding support for InterSystems IRIS.



DatabaseConnector 6.3.3
=======================

@@ -7,6 +26,10 @@ Changes:

- Updated Databricks driver to 2.6.36.

- Updated BigQuery driver to 1.6.2.

- Using `INSERT` with multiple values in `insertTable()` for DataBricks for faster inserts.


DatabaseConnector 6.3.2
=======================
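The 6.4.0 entry above is the headline change of this PR: support for InterSystems IRIS. A hedged sketch of what a connection could look like (the server format, port, and credentials are illustrative assumptions; `"iris"` is the dbms/dialect name used throughout the PR and in SqlRender >= 1.19):

```r
library(DatabaseConnector)

# Assumed to fetch the IRIS JDBC driver added in this PR; adjust the folder as needed.
downloadJdbcDrivers("iris", pathToDriver = "~/jdbcDrivers")

connection <- connect(createConnectionDetails(
  dbms = "iris",
  server = "localhost/USER",   # host/namespace format is an assumption
  port = 1972,                 # default IRIS SuperServer port
  user = "_SYSTEM",
  password = "SYS",
  pathToDriver = "~/jdbcDrivers"
))
querySql(connection, "SELECT 1 AS one;")
disconnect(connection)
```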
2 changes: 1 addition & 1 deletion R/Andromeda.R
@@ -1,4 +1,4 @@
# Copyright 2023 Observational Health Data Sciences and Informatics
# Copyright 2025 Observational Health Data Sciences and Informatics
#
# This file is part of DatabaseConnector
#
Expand Down
97 changes: 95 additions & 2 deletions R/BulkLoad.R
@@ -1,6 +1,6 @@
# @file BulkLoad.R
#
# Copyright 2023 Observational Health Data Sciences and Informatics
# Copyright 2025 Observational Health Data Sciences and Informatics
#
# This file is part of DatabaseConnector
#
@@ -62,6 +62,25 @@
return(FALSE)
}
return(TRUE)
} else if (dbms(connection) == "spark") {
envSet <- FALSE
container <- FALSE

if (Sys.getenv("AZR_STORAGE_ACCOUNT") != "" && Sys.getenv("AZR_ACCOUNT_KEY") != "" && Sys.setenv("AZR_CONTAINER_NAME") != "") {
envSet <- TRUE

}

# List storage containers to confirm the container
# specified in the configuration exists
ensure_installed("AzureStor")
azureEndpoint <- getAzureEndpoint()
containerList <- getAzureContainerNames(azureEndpoint)

if (Sys.getenv("AZR_CONTAINER_NAME") %in% containerList) {
container <- TRUE

}

return(envSet & container)

} else {
return(FALSE)
}
@@ -72,6 +91,18 @@
return(if (sshUser == "") "root" else sshUser)
}

getAzureEndpoint <- function() {
azureEndpoint <- AzureStor::storage_endpoint(
paste0("https://", Sys.getenv("AZR_STORAGE_ACCOUNT"), ".dfs.core.windows.net"),
key = Sys.getenv("AZR_ACCOUNT_KEY")
)
return(azureEndpoint)

}

getAzureContainerNames <- function(azureEndpoint) {
return(names(AzureStor::list_storage_containers(azureEndpoint)))

}

countRows <- function(connection, sqlTableName) {
sql <- "SELECT COUNT(*) FROM @table"
count <- renderTranslateQuerySql(
@@ -299,7 +330,19 @@
readr::write_excel_csv(data, csvFileName, na = "")
on.exit(unlink(csvFileName))

hostServerDb <- strsplit(attr(connection, "server")(), "/")[[1]]
server <- attr(connection, "server")()
if (is.null(server)) {

databaseMetaData <- rJava::.jcall(
connection@jConnection,
"Ljava/sql/DatabaseMetaData;",
"getMetaData"
)
server <- rJava::.jcall(databaseMetaData, "Ljava/lang/String;", "getURL")
server <- strsplit(server, "//")[[1]][2]

Check warning on line 342 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L336-L342

Added lines #L336 - L342 were not covered by tests
}

hostServerDb <- strsplit(server, "/")[[1]]

Check warning on line 345 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L345

Added line #L345 was not covered by tests
port <- attr(connection, "port")()
user <- attr(connection, "user")()
password <- attr(connection, "password")()
Expand Down Expand Up @@ -342,3 +385,53 @@
delta <- Sys.time() - startTime
inform(paste("Bulk load to PostgreSQL took", signif(delta, 3), attr(delta, "units")))
}

bulkLoadSpark <- function(connection, sqlTableName, data) {
ensure_installed("AzureStor")
logTrace(sprintf("Inserting %d rows into table '%s' using DataBricks bulk load", nrow(data), sqlTableName))
start <- Sys.time()

Check warning on line 392 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L390-L392

Added lines #L390 - L392 were not covered by tests

csvFileName <- tempfile("spark_insert_", fileext = ".csv")
write.csv(x = data, na = "", file = csvFileName, row.names = FALSE, quote = TRUE)
on.exit(unlink(csvFileName))

Check warning on line 396 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L394-L396

Added lines #L394 - L396 were not covered by tests

azureEndpoint <- getAzureEndpoint()
containers <- AzureStor::list_storage_containers(azureEndpoint)
targetContainer <- containers[[Sys.getenv("AZR_CONTAINER_NAME")]]
AzureStor::storage_upload(
targetContainer,
src=csvFileName,
dest=csvFileName
)

Check warning on line 405 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L398-L405

Added lines #L398 - L405 were not covered by tests

on.exit(
AzureStor::delete_storage_file(
targetContainer,
file = csvFileName,
confirm = FALSE
),
add = TRUE
)

Check warning on line 414 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L407-L414

Added lines #L407 - L414 were not covered by tests

sql <- SqlRender::loadRenderTranslateSql(
sqlFilename = "sparkCopy.sql",
packageName = "DatabaseConnector",
dbms = "spark",
sqlTableName = sqlTableName,
fileName = basename(csvFileName),
azureAccountKey = Sys.getenv("AZR_ACCOUNT_KEY"),
azureStorageAccount = Sys.getenv("AZR_STORAGE_ACCOUNT")
)

Check warning on line 424 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L416-L424

Added lines #L416 - L424 were not covered by tests

tryCatch(
{
DatabaseConnector::executeSql(connection = connection, sql = sql, reportOverallTime = FALSE)
},
error = function(e) {
abort("Error in DataBricks bulk upload. Please check DataBricks/Azure Storage access.")
}
)
delta <- Sys.time() - start
inform(paste("Bulk load to DataBricks took", signif(delta, 3), attr(delta, "units")))

Check warning on line 435 in R/BulkLoad.R

View check run for this annotation

Codecov / codecov/patch

R/BulkLoad.R#L426-L435

Added lines #L426 - L435 were not covered by tests
}

2 changes: 1 addition & 1 deletion R/Compression.R
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
# @file InsertTable.R
#
# Copyright 2023 Observational Health Data Sciences and Informatics
# Copyright 2025 Observational Health Data Sciences and Informatics
#
# This file is part of DatabaseConnector
#
Expand Down
Loading
Loading