Bubble column reactors have been modified to increase interfacial area concentration and mass-transfer rates. These modifications reduce bubble size and increase void fraction, but they also make the reactor hydrodynamics more complex. Consequently, the measurement techniques used for the design and scale-up of these reactors must be examined carefully, because each technique has its own advantages, disadvantages, and limitations. The most important and challenging step in designing a two-phase reactor is the measurement of void fraction, which largely determines reactor performance. Many correlations exist for predicting void fraction, but their accuracy ultimately depends on the reliability of the data obtained with practical measurement techniques. The present work first reviews several such void fraction measurement techniques and then compares data obtained under different operating conditions of bubble column reactors with five techniques: gamma-ray densitometry, electrical resistance tomography, wire-mesh sensor, optical void probe, and pressure transducers.
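As a minimal illustration of the last technique listed, the sketch below estimates the average void fraction between two pressure taps from the measured differential pressure, using the standard hydrostatic relation α = 1 − ΔP/(ρ_L g Δh). This is not a method from the present work; the liquid density, tap spacing, and the assumptions of negligible gas density and frictional pressure drop are illustrative.

```python
# Illustrative sketch: average gas holdup (void fraction) from a
# differential-pressure measurement, assuming the hydrostatic relation
# alpha = 1 - dP / (rho_L * g * dH), with negligible gas density and
# negligible frictional/acceleration pressure drop.

G = 9.81            # gravitational acceleration, m/s^2
RHO_LIQUID = 998.0  # assumed liquid density (water), kg/m^3

def void_fraction_from_dp(delta_p_pa: float, delta_h_m: float,
                          rho_liquid: float = RHO_LIQUID) -> float:
    """Average void fraction between two pressure taps spaced delta_h_m apart."""
    hydrostatic = rho_liquid * G * delta_h_m  # pressure drop if liquid filled the gap
    return 1.0 - delta_p_pa / hydrostatic

# Hypothetical reading: 0.5 m tap spacing, 3.9 kPa measured pressure drop
alpha = void_fraction_from_dp(3.9e3, 0.5)
```

In practice the transducer reading would also be corrected for line friction and sensor offsets, which this sketch omits.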