Time-interleaved analog-to-digital converters (TI-ADCs) are widely used in multi-gigabit orthogonal frequency-division multiplexing (OFDM) systems because they combine a high sampling rate with high resolution. In practice, however, offset mismatch, one of the major mismatches in TI-ADCs, arises between the parallel sub-ADCs. In this poster, we theoretically analyze the bit error rate (BER) performance of high-speed OFDM systems that use TI-ADCs with offset mismatch. Gray-coded PAM or QAM signaling over an additive white Gaussian noise (AWGN) channel is considered. Our numerical results show that the derived theoretical BER expressions are in excellent agreement with the simulated BER performance.
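As background, offset mismatch is commonly modeled by letting each of the M interleaved sub-ADCs add its own DC offset to the samples it converts, so the interleaved output contains a periodic (period-M) additive error that is independent of the input. The sketch below illustrates this standard model only; the channel count, offset values, and variable names are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def ti_adc_sample(x, offsets):
    """Model a TI-ADC with offset mismatch: sample k is converted by
    sub-ADC (k mod M), which adds its own DC offset to that sample."""
    M = len(offsets)
    idx = np.arange(len(x)) % M          # which sub-ADC handles each sample
    return x + np.asarray(offsets)[idx]  # signal plus per-channel offset

rng = np.random.default_rng(0)
N, M = 4096, 4
offsets = np.array([0.01, -0.02, 0.015, -0.005])  # hypothetical offsets

x = rng.standard_normal(N)   # stand-in for the received OFDM waveform
y = ti_adc_sample(x, offsets)

# The mismatch contributes a purely periodic error sequence with period M;
# in the frequency domain this appears as spurious tones at multiples of
# fs/M, which is what degrades the BER of the OFDM demodulator.
err = y - x
assert np.allclose(err, np.tile(offsets, N // M))
```

Because the error sequence is deterministic and periodic rather than noise-like, its effect on Gray-coded PAM/QAM decisions can be analyzed in closed form, which is the basis for comparing theoretical BER expressions against simulation.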