Categories
csv pandas python

Sequentially read a huge CSV file in Python

I have a 10 GB CSV file that contains some information that I need to use.

As I have limited memory on my PC, I cannot read the whole file into memory in a single batch. Instead, I would like to iteratively read only a few rows of this file at a time.

Say that on the first iteration I want to read the first 100 rows, on the second rows 101 to 200, and so on.

Is there an efficient way to perform this task in Python?
Does pandas provide something useful for this? Or are there better (in terms of memory and speed) methods?
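For illustration, here is a minimal sketch of the kind of chunked reading I have in mind, using the `chunksize` parameter of pandas' `read_csv` (the file name `example.csv` and the tiny generated data are stand-ins for my real 10 GB file):

```python
import pandas as pd

# Stand-in for the real 10 GB file: a small CSV with 250 rows.
pd.DataFrame({"a": range(250), "b": range(250)}).to_csv("example.csv", index=False)

# chunksize=100 makes read_csv return an iterator that yields
# DataFrames of up to 100 rows each, so only one chunk is ever
# held in memory at a time.
chunk_sizes = []
for chunk in pd.read_csv("example.csv", chunksize=100):
    # process each chunk here; for the sketch, just record its length
    chunk_sizes.append(len(chunk))

print(chunk_sizes)
```

This reads rows 0-99, then 100-199, and finally the remaining 50 rows, which seems to match the access pattern I described, but I am not sure whether it is the most memory- and speed-efficient option.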