# FarseerClientData

The data module provides Arquero-based operations for loading, transforming, and saving data. It is accessed via `client.data`.
## Dimension Table Operations

### loadFarseerDimensionTable(name)

Loads a dimension table as an Arquero table.

```typescript
loadFarseerDimensionTable(name: string): Promise<TableAndMetadata>
```

| Parameter | Type | Required | Description |
|---|---|---|---|
| name | string | Yes | Dimension table name |

Returns: `{ table: aq.internal.ColumnTable, metadata: object }`
```typescript
const { table, metadata } = await client.data.loadFarseerDimensionTable('Products');

table.print();
// | Name      | Description | Category      |
// |-----------|-------------|---------------|
// | Product A | Desc A      | [Electronics] |
// | Product B | Desc B      | [Furniture]   |

// Filter (note: connections are arrays, use [0])
const electronics = table.filter(d => d?.['Category'][0] === 'Electronics');
```
Dimension table connections (foreign keys) are stored as arrays. Always access the first element with `[0]` when filtering:

```typescript
// Wrong: compares an array against a string, so no rows match
table.filter(d => d?.['Category'] === 'Electronics');

// Correct: compares the first connected value
table.filter(d => d?.['Category'][0] === 'Electronics');
```
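The array convention above can be wrapped in a small helper so filters stay readable and tolerate empty or missing cells. This is an illustrative sketch, not part of the Farseer client API; `aq.escape` is standard Arquero for calling an external function inside `filter`.

```typescript
// Illustrative helper (not part of the Farseer client): read the first
// linked value from a connection cell, which may be an array, empty, or
// undefined.
function firstConnection(cell: unknown): string | undefined {
  return Array.isArray(cell) && cell.length > 0 ? String(cell[0]) : undefined;
}

// Usage inside an Arquero filter (aq.escape lets the expression call an
// external function):
// table.filter(aq.escape(d => firstConnection(d['Category']) === 'Electronics'));
```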
### loadFarseerDimensionTableBatched(name, batchSize?)

Loads a large dimension table in batches using an async generator.

```typescript
loadFarseerDimensionTableBatched(
  name: string,
  batchSize?: number
): AsyncGenerator<TableAndMetadata>
```

```typescript
for await (const { table } of client.data.loadFarseerDimensionTableBatched('LargeTable', 1000)) {
  console.log(`Loaded batch with ${table.numRows()} rows`);
  // Process each batch
}
```
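Because the batched loader is an async generator, each batch can be processed as it arrives without holding the whole table in memory. A minimal sketch of that consumption pattern, using a mock generator in place of the real method (the mock and its row shape are illustrative):

```typescript
// Mock stand-in for loadFarseerDimensionTableBatched: yields slices of a
// row array, one batch at a time.
async function* mockBatches<T>(rows: T[], batchSize: number) {
  for (let i = 0; i < rows.length; i += batchSize) {
    yield { rows: rows.slice(i, i + batchSize) };
  }
}

// Consume batches as they arrive, accumulating a running total.
async function countRows(): Promise<number> {
  let total = 0;
  for await (const { rows } of mockBatches([1, 2, 3, 4, 5], 2)) {
    total += rows.length; // process each batch here
  }
  return total;
}
```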
## Variable Export

### loadFarseerVariable(name, ...dimensions)

Exports a variable as an Arquero table with the specified dimensions.

```typescript
loadFarseerVariable(
  name: string,
  ...dimensions: string[]
): Promise<ExportedTableWithConfig>
```

```typescript
const result = await client.data.loadFarseerVariable(
  'Revenue',
  'Products',
  'Years',
  'Months'
);
result.table.print();
```
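The exported table can be reshaped like any other Arquero table; `result.table.objects()` (standard Arquero) yields plain row objects. A sketch that totals `Revenue` per product from such rows — the row shape below is assumed from the export call above, not a documented type:

```typescript
// Assumed row shape for a Revenue export over Products/Years/Months.
type RevenueRow = { Products: string; Years: number; Months: string; Revenue: number };

// Sum Revenue per product from plain row objects.
function revenueByProduct(rows: RevenueRow[]): Map<string, number> {
  const totals = new Map<string, number>();
  for (const r of rows) {
    totals.set(r.Products, (totals.get(r.Products) ?? 0) + r.Revenue);
  }
  return totals;
}

// e.g. revenueByProduct(result.table.objects() as RevenueRow[])
```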
## File Operations

### loadXlsxFile(fileItem) / loadXlsxFile(buffer)

Loads an Excel file as an Arquero table. Accepts either a Farseer file item or a raw Buffer.

```typescript
// Overload 1: From Farseer file
loadXlsxFile(fileItem: FolderItemRepresentation): Promise<aq.internal.ColumnTable>

// Overload 2: From Buffer
loadXlsxFile(buffer: Buffer): Promise<aq.internal.ColumnTable>
```

From a Farseer file:

```typescript
const fileItem = await client.getItemByPath(['Finance', 'data.xlsx']);
const table = await client.data.loadXlsxFile(fileItem);
table.print();
```

From a Buffer:

```typescript
import * as fs from 'fs';

const buffer = fs.readFileSync('/path/to/local/data.xlsx');
const table = await client.data.loadXlsxFile(buffer);
table.print();
```
### saveToCsvFarseerFile(table, fileName, folderId) / saveToCsvFarseerFile(table, fileItem)

Saves an Arquero table as a CSV file in Farseer. Accepts either a file name plus folder ID (to create a new file) or an existing file item (to update it).

```typescript
// Overload 1: Create a new file by name
saveToCsvFarseerFile(
  table: aq.internal.ColumnTable,
  fileName: string,
  folderId: number
): Promise<void>

// Overload 2: Update an existing file
saveToCsvFarseerFile(
  table: aq.internal.ColumnTable,
  fileItem: FolderItemRepresentation
): Promise<void>
```

By file name:

```typescript
import * as aq from 'arquero';

const data = aq.table({ Name: ['A', 'B'], Value: [100, 200] });
const folder = await client.getItemByPath(['Reports']);
await client.data.saveToCsvFarseerFile(data, 'output.csv', folder.id);
```

By existing file item (update):

```typescript
const existingFile = await client.getFolderItem('Reports', 'output.csv');
await client.data.saveToCsvFarseerFile(data, existingFile);
```
## Import Operations

### importDataTable(config)

High-level import method that combines an Arquero table with import job creation. Supports the same two styles as `createImportJob()`: simplified columns or full metadata.

```typescript
// Overload 1: With simplified columns (recommended)
importDataTable(config: {
  data: aq.internal.ColumnTable;
  title: string;
  columns: ImportMetadataColumn[];
  description?: string;
  labels?: string[];
}): Promise<ImportActions>

// Overload 2: With full metadata
importDataTable(config: {
  data: aq.internal.ColumnTable;
  title: string;
  metadata: ImportJobMetadata;
  description?: string;
  labels?: string[];
}): Promise<ImportActions>
```
The method automatically extracts the rows from the Arquero table and calls `addRows()` and `flushRows()` for you. You only need to call `undoPrevious()` and `commit()` on the returned `ImportActions`:
```typescript
const importJob = await client.data.importDataTable({
  data: myArqueroTable,
  title: 'Revenue Import',
  columns: [
    { type: 'DIMENSION_TABLE', dimensionTableName: 'Products' },
    { type: 'VARIABLE', variableName: 'Revenue' },
  ],
  labels: ['auto', 'revenue'],
});

// Rows are already flushed: just undo the previous import and commit
await importJob.undoPrevious();
await importJob.commit();
```
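Since only `undoPrevious()` and `commit()` remain after `importDataTable()` returns, the replace-then-commit step can be made defensive with a thin wrapper. The `ImportActions` surface below is taken from this page; the wrapper itself is an illustrative sketch, not part of the client:

```typescript
// Minimal ImportActions surface as documented above.
interface ImportActions {
  undoPrevious(): Promise<void>;
  commit(): Promise<void>;
}

// Undo the previous import, then commit; report failure instead of throwing.
async function replaceImport(job: ImportActions): Promise<'committed' | 'failed'> {
  try {
    await job.undoPrevious();
    await job.commit();
    return 'committed';
  } catch {
    return 'failed';
  }
}
```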
## Database Operations

### mssqlQuery(connectionPool, query)

Executes a SQL query against Microsoft SQL Server and returns an Arquero table.

```typescript
mssqlQuery(
  connectionPool: mssql.ConnectionPool,
  query: string
): Promise<aq.internal.ColumnTable>
```

```typescript
import * as mssql from 'mssql';

const pool = await mssql.connect({
  server: 'localhost',
  database: 'mydb',
  user: 'user',
  password: 'password',
  options: { encrypt: false }
});

const data = await client.data.mssqlQuery(pool, 'SELECT * FROM Products');
data.print();
```
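The pool should be released once the query work is done; `pool.close()` is the real mssql API for that. A small try/finally sketch — the `withPool` helper and its minimal pool type are illustrative, not part of the client:

```typescript
// Run some async work against a pool and always close the pool afterwards,
// even if the work throws.
async function withPool<T>(
  pool: { close(): Promise<void> },
  work: () => Promise<T>
): Promise<T> {
  try {
    return await work();
  } finally {
    await pool.close(); // release connections on success and failure alike
  }
}

// e.g. withPool(pool, () => client.data.mssqlQuery(pool, 'SELECT * FROM Products'))
```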
### syncMultipleTables(tables)

Syncs multiple dimension tables from external data sources. Wraps `model.load()` with a simpler interface.

```typescript
syncMultipleTables(tables: Array<{
  name: string;
  columns: ModelTableColumnRepresentation[];
  rows: aq.internal.ColumnTable;
}>): Promise<void>
```

| Field | Type | Description |
|---|---|---|
| name | string | Dimension table name in Farseer |
| columns | ModelTableColumnRepresentation[] | Column definitions with types |
| rows | aq.internal.ColumnTable | Arquero table with the data |
```typescript
await client.data.syncMultipleTables([
  {
    name: 'Products',
    columns: [
      { name: 'Name', type: 'PrimaryKey' },
      { name: 'Category', type: 'ForeignKey', foreignKeyTableName: 'Categories' }
    ],
    rows: productTable // Arquero table
  },
  {
    name: 'Categories',
    columns: [
      { name: 'Name', type: 'PrimaryKey' },
      { name: 'Description', type: 'Description' }
    ],
    rows: categoryTable
  }
]);
```
The property for the data is `rows`, not `data`. Pass an Arquero ColumnTable; the method internally converts it to row arrays and handles connection array joining.
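Before calling `syncMultipleTables()`, it can help to sanity-check the column definitions. The rules below are assumptions inferred from the example above (exactly one `PrimaryKey` per table, `foreignKeyTableName` required on `ForeignKey` columns), not documented constraints:

```typescript
// Column shape as used in the syncMultipleTables example above.
type SyncColumn = { name: string; type: string; foreignKeyTableName?: string };

// Collect validation errors; an empty array means the columns look usable.
// Assumption: each table has exactly one PrimaryKey column.
function validateSyncColumns(columns: SyncColumn[]): string[] {
  const errors: string[] = [];
  if (columns.filter(c => c.type === 'PrimaryKey').length !== 1) {
    errors.push('expected exactly one PrimaryKey column');
  }
  for (const c of columns) {
    if (c.type === 'ForeignKey' && !c.foreignKeyTableName) {
      errors.push(`ForeignKey column "${c.name}" is missing foreignKeyTableName`);
    }
  }
  return errors;
}
```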