Merged
2 changes: 1 addition & 1 deletion .github/workflows/ci-cd.yml
@@ -80,7 +80,7 @@ jobs:
      - name: Upload pages artifact
        uses: actions/upload-pages-artifact@v4
        with:
          path: docs
          path: demo
      - name: Deploy to GitHub Pages
        uses: actions/deploy-pages@v5

2 changes: 1 addition & 1 deletion .gitignore
@@ -7,4 +7,4 @@ node_modules
.DS_Store
docs/jsdoc/
docs/api/
demo/
demo/api
67 changes: 64 additions & 3 deletions README.md
@@ -17,14 +17,14 @@
![TypeScript](https://img.shields.io/badge/typescript-%23007ACC.svg?style=for-the-badge&logo=typescript&logoColor=white)

>
Convert CSV files to JSON with **no dependencies**. Supports Node.js (Sync & Async), and Browser environments with full RFC 4180 compliance.
Convert CSV files to JSON with **no dependencies**. Supports Node.js (Sync & Async) and Browser environments with full RFC 4180 compliance. **Memory-efficient streaming** processes large files without loading them entirely into memory.

## Overview

Transform CSV data into JSON with a simple, chainable API. Choose your implementation style:

- **[Synchronous API](docs/SYNC.md)** - Blocking operations for simple workflows
- **[Asynchronous API](docs/ASYNC.md)** - Promise-based for modern async/await patterns
- **[Asynchronous API](docs/ASYNC.md)** - Promise-based for modern async/await patterns with **memory-efficient streaming** for large files
- **[Browser API](docs/BROWSER.md)** - Client-side CSV parsing for web applications

## Demo and JSDoc
@@ -40,7 +40,7 @@ Transform CSV data into JSON with a simple, chainable API. Choose your implement
✅ **Full TypeScript Support** - Included type definitions for all APIs
✅ **Flexible Configuration** - Custom delimiters, encoding, trimming, and more
✅ **Method Chaining** - Fluent API for readable code
✅ **Large File Support** - Stream processing for memory-efficient handling
✅ **Memory-Efficient Streaming** - Process large files without loading them entirely into memory
✅ **Comprehensive Error Handling** - Detailed, actionable error messages with solutions (see [ERROR_HANDLING.md](docs/ERROR_HANDLING.md))

## RFC 4180 Standard
@@ -150,6 +150,9 @@ All APIs (Sync, Async and Browser) support the same configuration methods:
- `trimHeaderFieldWhiteSpace(bool)` - Remove spaces from headers
- `parseSubArray(delim, sep)` - Parse delimited arrays
- `mapRows(fn)` - Transform, filter, or enrich each row
- `getJsonFromStreamAsync(stream)` - Parse CSV from a Readable stream (Node.js and Browser)
- `getJsonFromFileStreamingAsync(filePath)` - Stream large files without loading them entirely into memory (Node.js and Browser)
- `getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})` - Streaming parse with progress callbacks for large files
- `utf8Encoding()`, `latin1Encoding()`, etc. - Set file encoding

### Examples
@@ -233,6 +236,64 @@ csvToJson.latin1Encoding().getJsonFromCsv('data.csv');
csvToJson.customEncoding('ucs2').getJsonFromCsv('data.csv');
```

#### `getJsonFromStreamAsync(stream)` - Process CSV from Readable streams
```js
const fs = require('fs');
const csvToJson = require('convert-csv-to-json');

// Process large files without loading them entirely into memory
async function processLargeCSV() {
  const stream = fs.createReadStream('large-dataset.csv');
  const jsonData = await csvToJson
    .fieldDelimiter(';')
    .supportQuotedField(true)
    .getJsonFromStreamAsync(stream);

  console.log(`Processed ${jsonData.length} records efficiently`);
  return jsonData;
}
```

#### `getJsonFromFileStreamingAsync(filePath)` - Stream processing for large files
```js
const csvToJson = require('convert-csv-to-json');

// Most efficient way to process large CSV files
async function processLargeCSV(filePath) {
  const jsonData = await csvToJson
    .fieldDelimiter(',')
    .formatValueByType()
    .getJsonFromFileStreamingAsync(filePath);

  console.log(`Streamed and processed ${jsonData.length} records`);
  return jsonData;
}

// Usage - processes large files without loading them entirely into memory
// (top-level await needs an ES module; in CommonJS, use .then instead)
processLargeCSV('massive-dataset.csv').then((data) => {
  console.log(data.length);
});
```

#### `getJsonFromFileStreamingAsyncWithCallback(filePath, options = {})` - Parse CSV from a File object using streaming with progress callbacks for large files

```js
const csvToJson = require('convert-csv-to-json');
const fileInput = document.querySelector('#csvfile').files[0];

csvToJson.browser.getJsonFromFileStreamingAsyncWithCallback(fileInput, {
  chunkSize: 500,
  onChunk: (rows, processed, total) => {
    console.log(`Processed ${processed}/${total} rows`);
    // Handle each chunk of rows here
  },
  onComplete: (allRows) => {
    console.log('Processing complete!');
  },
  onError: (error) => {
    console.error('Error:', error);
  },
});
```
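The `onChunk` contract above (batches of `chunkSize` rows plus running counts) can be sketched as a plain function. `chunkRows` is a hypothetical helper for illustration, not part of the library's API:

```javascript
// Deliver rows in batches of chunkSize, reporting processed/total
// counts the way the onChunk callback above receives them.
function chunkRows(rows, chunkSize, onChunk) {
  let processed = 0;
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    processed += chunk.length;
    onChunk(chunk, processed, rows.length);
  }
}

const calls = [];
chunkRows(['a', 'b', 'c', 'd', 'e'], 2, (chunk, processed, total) => {
  calls.push(`${processed}/${total}`);
});
console.log(calls.join(' ')); // prints 2/5 4/5 5/5
```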

See [SYNC.md](docs/SYNC.md), [ASYNC.md](docs/ASYNC.md) or [BROWSER.md](docs/BROWSER.md) for complete configuration details.

## Example: Complete Workflow
File renamed without changes.