One possible way to train a single neural network model on multiple time series datasets in TensorFlow, without concatenating them into one array, is to use the tf.data API to load and preprocess each series separately and then interleave their samples into one input pipeline.

Here are the basic steps for implementing this method:

  1. Define a function to load and preprocess a single time series file. This function should take a file path as input and return a tf.data.Dataset object representing that series.

  2. Use the tf.data.Dataset.list_files() method to create a dataset of file paths, one per time series file.

  3. Use the tf.data.Dataset.interleave() method with the loading function as its map function. interleave() applies the function to each file path and mixes the samples from the resulting per-series datasets into a single dataset. (Mapping the loading function first and then interleaving would produce an invalid dataset of datasets; interleave() does the mapping itself.)

  4. Use the resulting dataset as input to the neural network model.
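Before the full pipeline, the interleaving behavior itself can be illustrated with in-memory data. This is only a sketch: the three short series and the `make_series_dataset` helper below are made-up stand-ins for datasets loaded from files.

```python
import tensorflow as tf

# Three toy "series" standing in for data loaded from three files.
series = [[1, 2, 3], [10, 20, 30], [100, 200, 300]]
indices = tf.data.Dataset.range(len(series))

def make_series_dataset(i):
    # In practice this would load and preprocess one file;
    # here it just wraps one in-memory series as a dataset.
    return tf.data.Dataset.from_tensor_slices(tf.gather(series, i))

# Round-robin one sample at a time across all three series.
interleaved = indices.interleave(
    make_series_dataset, cycle_length=3, block_length=1)

samples = [int(v) for v in interleaved.as_numpy_iterator()]
print(samples)  # round-robin across series: [1, 10, 100, 2, 20, 200, 3, 30, 300]
```

With cycle_length equal to the number of series and block_length=1, the output alternates between series, so consecutive training batches mix samples from all of them.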

Here is an example code snippet that demonstrates this method:

import tensorflow as tf

def load_and_preprocess_data(file_path):
    # Load and preprocess the time series at file_path,
    # returning a tf.data.Dataset of samples for that series
    ...

# Create a dataset of file paths, one per time series file
file_paths_dataset = tf.data.Dataset.list_files('/path/to/data_set/*.csv')

# Apply the loading function to each file path and interleave
# the samples from the resulting per-series datasets
interleaved_dataset = file_paths_dataset.interleave(
    load_and_preprocess_data,
    cycle_length=4,  # number of series read concurrently
    num_parallel_calls=tf.data.AUTOTUNE)

# Define and compile the neural network model
model = ...

model.compile(...)

# Train the model on the interleaved dataset
model.fit(interleaved_dataset, ...)
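The body of load_and_preprocess_data depends entirely on your file format, so the snippet above leaves it elided. As one hypothetical sketch, assuming each CSV holds a single univariate series with one numeric value per line and no header, the function could parse the lines and slice the series into fixed-length windows:

```python
import tensorflow as tf

# Hypothetical loader: one numeric column per line, no header.
# The window length of 10 is an arbitrary choice for illustration.
def load_and_preprocess_data(file_path):
    lines = tf.data.TextLineDataset(file_path)
    values = lines.map(lambda s: tf.strings.to_number(s, tf.float32))
    # Slice the series into overlapping windows of 10 time steps
    return values.window(10, shift=1, drop_remainder=True).flat_map(
        lambda w: w.batch(10))
```

For real data you would typically also parse multiple columns, split features from targets, and add shuffling and batching (e.g. interleaved_dataset.shuffle(1000).batch(32)) before calling model.fit().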