Two pure tones with sufficiently different frequencies yield action potentials in separate populations of auditory nerve fibers. Cochlear filtering induces a temporal shift between the firings of these populations, with fibers coding for high frequencies firing earlier than those coding for low frequencies. Despite this shift, the two tones are still perceived as synchronous. Studies with normal-hearing (NH) subjects have suggested that the cochlear delay may be compensated at a more central level of the auditory system. The exact central mechanisms involved remain unclear, however, and studies in NH subjects are subject to limitations such as spectral-splatter effects. Since in cochlear implant (CI) users the basilar membrane is bypassed and the electrodes stimulate the nerve fibers directly, there is no cochlear delay. A remaining question is whether this cochlear delay is important for sound perception, which would argue in favor of introducing it in CI processors. The main goal of this study is to evaluate the sensitivity of CI recipients to time delays between electrodes. A group of MED-EL cochlear implant recipients took part in an electrode-delay discrimination experiment. The task was a 3I-2AFC in which each interval consisted of two electrical pulses presented on the most apical and most basal electrodes. Delay discrimination thresholds were measured adaptively following a 2-down 1-up rule in six conditions differing in electrode order (apical or basal first) and in reference delay (0, 10, or 20 ms). We hypothesized that CI users would discriminate better when the reference delay is comparable to the cochlear delay in NH listeners (about 10 ms). Preliminary results with 5 subjects show that thresholds tend to increase with reference delay, consistent with previous findings. However, there does not appear to be a trend toward lower thresholds when the across-electrode delay mimics what would be obtained in a normal-hearing cochlea.
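
As an illustration of the adaptive procedure described above, the following Python sketch simulates a 2-down 1-up staircase, which converges on the 70.7%-correct point of the psychometric function. The starting increment, step size, number of reversals, and the simulated observer (simulated_listener, with an assumed just-noticeable difference jnd_ms) are illustrative assumptions only, not the parameters used in the study.

    import random

    def simulated_listener(delta_ms, jnd_ms=3.0):
        """Hypothetical observer: probability of a correct response increases
        with the delay increment delta_ms relative to an assumed JND."""
        p_correct = 1.0 - 0.5 * 2.0 ** (-(delta_ms / jnd_ms) ** 2)
        return random.random() < p_correct

    def two_down_one_up(start_delta=8.0, step=1.0, min_delta=0.1, max_reversals=8):
        """2-down 1-up rule: make the task harder (smaller delay increment) after
        two consecutive correct responses, easier after any incorrect response."""
        delta = start_delta
        correct_streak = 0
        last_direction = None
        reversals = []
        while len(reversals) < max_reversals:
            if simulated_listener(delta):
                correct_streak += 1
                if correct_streak < 2:
                    continue                      # need two in a row to step down
                correct_streak = 0
                direction, new_delta = "down", max(min_delta, delta - step)
            else:
                correct_streak = 0
                direction, new_delta = "up", delta + step
            if last_direction is not None and direction != last_direction:
                reversals.append(delta)           # record level at each reversal
            last_direction, delta = direction, new_delta
        # Threshold estimate: mean of the last six reversal levels.
        return sum(reversals[-6:]) / len(reversals[-6:])

    if __name__ == "__main__":
        random.seed(0)
        print("Estimated delay discrimination threshold: %.2f ms" % two_down_one_up())

In the actual experiment, the response of the CI recipient on each 3I-2AFC trial would take the place of simulated_listener.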