Understanding gravitational wave emission from core-collapse supernovae will be essential for their detection with current and future gravitational wave detectors. This requires a sample of waveforms from modern 3D supernova simulations reaching well into the explosion phase, where gravitational wave emission is expected to peak. However, recent waveforms from 3D simulations with multigroup neutrino transport do not reach far into the explosion phase, and some are still obtained from non-exploding models. We therefore calculate waveforms up to 0.9 s after bounce using the neutrino hydrodynamics code COCONUT-FMT. We consider two models with low and normal explosion energy, namely explosions of an ultra-stripped progenitor with an initial helium star mass of 3.5 M⊙, and of an 18 M⊙ single star. Both models show gravitational wave emission from the excitation of surface g modes in the proto-neutron star with frequencies between ∼800 and 1000 Hz at peak emission. The peak amplitudes are about 6 and 10 cm, respectively, which is somewhat higher than in most recent 3D models of the pre-explosion or early explosion phase. Using a Bayesian analysis, we determine the maximum detection distances for our models in simulated Advanced LIGO, Advanced Virgo, and Einstein Telescope (ET) design sensitivity noise. The more energetic 18 M⊙ explosion will be detectable to about 17.5 kpc by the LIGO/Virgo network and to about 180 kpc with the ET.
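As an illustrative aside (not part of the paper's analysis), the peak amplitudes quoted in centimetres can be converted to a dimensionless strain under the standard convention that a quadrupole amplitude A corresponds to h = A/D for a source at distance D; the function name and the 10 kpc fiducial Galactic distance below are assumptions for the sketch.

```python
# Minimal sketch: dimensionless strain h = A / D from an amplitude in cm.
# Not the paper's Bayesian pipeline; purely an order-of-magnitude conversion.
KPC_IN_CM = 3.086e21  # 1 kiloparsec in centimetres

def strain(amplitude_cm: float, distance_kpc: float) -> float:
    """Return the dimensionless strain h = A / D for amplitude A (cm)
    at source distance D (kpc)."""
    return amplitude_cm / (distance_kpc * KPC_IN_CM)

# Peak amplitude of the 18 Msun model (~10 cm) at a fiducial 10 kpc:
h_galactic = strain(10.0, 10.0)
print(f"{h_galactic:.2e}")  # of order 1e-22, typical for Galactic CCSN signals
```

Such strains of order 10^-22 sit near the sensitivity floor of current detectors at kHz frequencies, consistent with the detection distances quoted above being limited to the Galaxy for the LIGO/Virgo network.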