## Manual Write

```ts
import { parse } from '@fast-csv/parse';

const stream = parse({ headers: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write('header1,header2\n');
stream.write('col1,col2');
stream.end();
```
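Because the parser is a standard Node.js transform stream, you can also pipe data into it rather than calling `write()` yourself. A minimal sketch, assuming a local CSV file (the path below is hypothetical):

```ts
import * as fs from 'fs';
import { parse } from '@fast-csv/parse';

// Pipe a file's contents into the parser instead of writing chunks manually.
// 'path/to/data.csv' is a placeholder path for illustration.
fs.createReadStream('path/to/data.csv')
    .pipe(parse({ headers: true }))
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));
```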
## Alternate Delimiter

You can provide a `delimiter` option to change the delimiter from the default `,` character.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'a1\tb1',
    'a2\tb2'
].join(EOL);

const stream = parse({ delimiter: '\t' })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
## Headers

### First Row As Headers

If you expect the first line of your CSV to be headers, you may pass in a `headers` option.

Setting the `headers` option to `true` will cause each row to be emitted as an object rather than an array.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'a,b',
    'a1,b1',
    'a2,b2'
].join(EOL);

const stream = parse({ headers: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
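If the whole CSV is already in memory, `@fast-csv/parse` also exports a `parseString` helper that wires up the same stream for you. A minimal sketch of the example above using it (assuming the `parseString` export):

```ts
import { EOL } from 'os';
import { parseString } from '@fast-csv/parse';

const CSV_STRING = ['a,b', 'a1,b1', 'a2,b2'].join(EOL);

// parseString writes the string and ends the stream for you.
parseString(CSV_STRING, { headers: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));
```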
### Custom Headers

You may alternatively pass an array of header names.

The order of the headers array should match the order of the fields in the CSV; otherwise the data columns will not match.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'a1,b1',
    'a2,b2'
].join(EOL);

const stream = parse({ headers: ['a', 'b'] })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
### Renaming Headers

If the CSV contains a header row but you want to provide custom headers, you can pass an array of headers and set `renameHeaders` to `true`.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'header1,header2',
    'a1,b1',
    'a2,b2'
].join(EOL);

const stream = parse({ headers: ['a', 'b'], renameHeaders: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
### Transforming Headers

If the CSV contains a header row but you want to transform the headers, you can provide a function to the `headers` option.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'header1,header2',
    'a1,b1',
    'a2,b2'
].join(EOL);

const stream = parse({
    headers: headers => headers.map(h => h?.toUpperCase()),
})
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
### Skipping Columns

To omit some of the data columns you may not need, pass a sparse array as `headers`.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'a1,b1,c1',
    'a2,b2,c2'
].join(EOL);

const stream = parse({ headers: ['a', undefined, 'c'] })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
## Ignoring Empty Rows

If your data includes empty rows (the sort Excel might include at the end of a file, for instance), you can ignore them with the `ignoreEmpty` option.

Any rows consisting of nothing but empty strings and/or commas will be skipped, without emitting a `'data'`, `'data-invalid'`, or `'error'` event.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'a1,b1',
    ',',
    'a2,b2',
    ' ,\t',
    '',
].join(EOL);

const stream = parse({ ignoreEmpty: true })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
## Transforming

You can transform data by providing a transform function. What is returned from the transform function will be provided to validate and emitted as a row.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'firstName,lastName',
    'bob,yukon',
    'sally,yukon',
    'timmy,yukon',
].join(EOL);

type UserRow = {
    firstName: string;
    lastName: string;
};

type TransformedUserRow = UserRow & {
    properName: string;
};

const stream = parse<UserRow, TransformedUserRow>({ headers: true })
    .transform(
        (data: UserRow): TransformedUserRow => ({
            firstName: data.firstName.toUpperCase(),
            lastName: data.lastName.toUpperCase(),
            properName: `${data.firstName} ${data.lastName}`,
        }),
    )
    .on('error', error => console.error(error))
    .on('data', (row: TransformedUserRow) => console.log(JSON.stringify(row)))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
`fast-csv` also supports async transformation with a callback.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'firstName,lastName',
    'bob,yukon',
    'sally,yukon',
    'timmy,yukon',
].join(EOL);

type UserRow = {
    firstName: string;
    lastName: string;
};

type TransformedUserRow = UserRow & {
    properName: string;
};

const stream = parse<UserRow, TransformedUserRow>({ headers: true })
    .transform((data, cb): void => {
        setImmediate(() =>
            cb(null, {
                firstName: data.firstName.toUpperCase(),
                lastName: data.lastName.toUpperCase(),
                properName: `${data.firstName} ${data.lastName}`,
            }),
        );
    })
    .on('error', error => console.error(error))
    .on('data', row => console.log(JSON.stringify(row)))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
## Validation

You can validate each row in the CSV by providing a validate handler. If a row is invalid, a `data-invalid` event will be emitted with the row and the index.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'firstName,lastName',
    'bob,yukon',
    'sally,yukon',
    'timmy,yukon',
].join(EOL);

type UserRow = {
    firstName: string;
    lastName: string;
};

const stream = parse<UserRow, UserRow>({ headers: true })
    .validate((data: UserRow): boolean => data.firstName !== 'bob')
    .on('error', error => console.error(error))
    .on('data', (row: UserRow) => console.log(`Valid [row=${JSON.stringify(row)}]`))
    .on('data-invalid', (row, rowNumber) =>
        console.log(`Invalid [rowNumber=${rowNumber}] [row=${JSON.stringify(row)}]`),
    )
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
`fast-csv` also supports async validation with a callback.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'firstName,lastName',
    'bob,yukon',
    'sally,yukon',
    'timmy,yukon',
].join(EOL);

type UserRow = {
    firstName: string;
    lastName: string;
};

const stream = parse<UserRow, UserRow>({ headers: true })
    .validate((row, cb): void => {
        setImmediate(() => cb(null, row.firstName !== 'bob'));
    })
    .on('error', error => console.error(error))
    .on('data', row => console.log(`Valid [row=${JSON.stringify(row)}]`))
    .on('data-invalid', row => console.log(`Invalid [row=${JSON.stringify(row)}]`))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
Sometimes you may wish to provide a reason that a row was invalid; you can use the callback to provide additional info.

```ts
import { EOL } from 'os';
import { parse } from '@fast-csv/parse';

const CSV_STRING = [
    'firstName,lastName',
    'bob,yukon',
    'sally,yukon',
    'timmy,yukon',
].join(EOL);

type UserRow = {
    firstName: string;
    lastName: string;
};

const stream = parse<UserRow, UserRow>({ headers: true })
    .validate((row, cb): void => {
        const isValid = row.firstName !== 'bob';
        if (!isValid) {
            return cb(null, false, 'Name is bob');
        }
        return cb(null, true);
    })
    .on('error', error => console.error(error))
    .on('data', row => console.log(`Valid [row=${JSON.stringify(row)}]`))
    .on('data-invalid', (row, rowNumber, reason) => {
        console.log(`Invalid [rowNumber=${rowNumber}] [row=${JSON.stringify(row)}] [reason=${reason}]`);
    })
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

stream.write(CSV_STRING);
stream.end();
```
## Max Rows

In the following example there are 10 rows, but only 5 will be parsed because of the `maxRows` option.

```ts
import { parse } from '@fast-csv/parse';

const rows = [
    'header1,header2\n',
    'col1,col1\n',
    'col2,col2\n',
    'col3,col3\n',
    'col4,col4\n',
    'col5,col5\n',
    'col6,col6\n',
    'col7,col7\n',
    'col8,col8\n',
    'col9,col9\n',
    'col10,col10',
];

const stream = parse({ headers: true, maxRows: 5 })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

rows.forEach(row => stream.write(row));
stream.end();
```
## Skip Rows

In the following example the first 2 rows are skipped.

Notice how the headers are not skipped, only the rows.

```ts
import { parse } from '@fast-csv/parse';

const rows = [
    'header1,header2\n',
    'col1,col1\n',
    'col2,col2\n',
    'col3,col3\n',
    'col4,col4\n',
    'col5,col5\n',
    'col6,col6\n',
];

const stream = parse({ headers: true, skipRows: 2 })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

rows.forEach(row => stream.write(row));
stream.end();
```
## Skip Lines

In the following example the first 2 lines are skipped.

Notice how the headers come from the third line, because the first two are skipped.

```ts
import { parse } from '@fast-csv/parse';

const rows = [
    'skip1_header1,skip1_header2\n',
    'skip2_header1,skip2_header2\n',
    'header1,header2\n',
    'col1,col1\n',
    'col2,col2\n',
    'col3,col3\n',
    'col4,col4\n',
];

const stream = parse({ headers: true, skipLines: 2 })
    .on('error', error => console.error(error))
    .on('data', row => console.log(row))
    .on('end', (rowCount: number) => console.log(`Parsed ${rowCount} rows`));

rows.forEach(row => stream.write(row));
stream.end();
```