Assuming you have an `all_dataset` variable of type `tf.data.Dataset`:
test_dataset = all_dataset.take(1000)
train_dataset = all_dataset.skip(1000)
The test dataset now holds the first 1000 elements, and the rest go to training.
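The split can be verified with `Dataset.cardinality` — a minimal sketch, using a hypothetical `range` dataset standing in for `all_dataset`:

```python
import tensorflow as tf

# Hypothetical stand-in for all_dataset: 5000 integer elements.
all_dataset = tf.data.Dataset.range(5000)

# First 1000 elements for testing, the remaining 4000 for training.
test_dataset = all_dataset.take(1000)
train_dataset = all_dataset.skip(1000)

print(int(test_dataset.cardinality()))   # 1000
print(int(train_dataset.cardinality()))  # 4000
```

One caveat worth noting: if `all_dataset` was built with `shuffle(..., reshuffle_each_iteration=True)` (the default), the shuffle order changes between iterations, so `take` and `skip` can yield overlapping elements. Shuffling with `reshuffle_each_iteration=False`, or shuffling before caching, keeps the split stable.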