Superluminous supernovae (SLSNe) are a distinct class of stellar explosions, with peak luminosities 10–100 times higher than those of normal SNe. Their extreme luminosities cannot be explained by the radioactive decay of 56Ni and its daughter 56Co alone. Consequently, models invoking newly formed millisecond magnetars, which can supply additional energy through magnetic dipole radiation, have been widely proposed. For these rapidly rotating magnetars, however, gravitational-wave (GW) emission may also contribute significantly to the spin-down, particularly during their early evolution. Meanwhile, high-energy photons that are initially trapped within the optically thick ejecta eventually escape as the expanding ejecta becomes transparent, thereby influencing the late-time lightcurve. In this work, we adopt an analytical framework to systematically explore the combined effects of GW emission and high-energy-photon leakage on the lightcurves of SLSNe. Compared to scenarios that neglect these processes, we find that for magnetars with millisecond initial spin periods, the combined influence suppresses early-time luminosities but enhances late-time emission. We further investigate how the lightcurve depends on the neutron-star equation of state, the GW emission efficiency, the ejecta mass, and other relevant quantities. Our results highlight the complex interplay between GW-driven spin-down and radiative transport in shaping the observable features of SLSNe, offering new insights into diagnosing the nature of their central engines.
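The spin-down and leakage physics summarized above are commonly parametrized as follows. This is a sketch using standard formulae from the magnetar-engine literature; the normalizations and the specific leakage prescription are assumptions, not necessarily the exact forms adopted in this work:

```latex
% Spin-down of a magnetar with moment of inertia I and angular frequency \Omega,
% driven jointly by magnetic dipole radiation and GW emission:
I \Omega \dot{\Omega} = -L_{\rm dip} - L_{\rm GW},
\qquad
L_{\rm dip} = \frac{B_p^{2} R^{6} \Omega^{4}}{6 c^{3}},
\qquad
L_{\rm GW} = \frac{32\, G\, I^{2} \epsilon^{2} \Omega^{6}}{5 c^{5}},
% where B_p is the polar dipole field, R the neutron-star radius,
% and \epsilon the ellipticity of the deformed star.
% Because L_{\rm GW} \propto \Omega^{6} while L_{\rm dip} \propto \Omega^{4},
% GW losses dominate at early times for millisecond spin periods.

% A standard leakage factor for high-energy photons escaping the ejecta:
L_{\rm obs}(t) = L_{\rm inp}(t)\left(1 - e^{-A/t^{2}}\right),
\qquad
A = \frac{3 \kappa_{\gamma} M_{\rm ej}}{4\pi v_{\rm ej}^{2}},
% where \kappa_{\gamma} is the gamma-ray opacity, M_{\rm ej} the ejecta mass,
% and v_{\rm ej} the ejecta velocity; leakage grows as the ejecta thins out,
% suppressing the thermalized (observed) luminosity at late times.
```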