Transparency in moderation practices is crucial to the success of an online community. To meet growing demands for transparency and accountability, several academics came together and proposed the Santa Clara Principles on Transparency and Accountability in Content Moderation (SCP). In 2018, Reddit, home to uniquely moderated communities called subreddits, announced in its transparency report that it was aligning its content moderation practices with the SCP. But do the moderators of subreddit communities follow these guidelines too? In this paper, we answer this question by applying a mixed-methods approach to public moderation logs collected from 204 subreddits over a period of five months, containing more than 0.5M instances of removals by both human moderators and AutoModerator. Our results reveal a lack of transparency in moderation practices. We find that while subreddits often rely on AutoModerator to sanction newcomers' posts based on karma requirements and to moderate uncivil content using automated keyword lists, users are neither notified of these sanctions, nor are these practices formally stated in any of the subreddits' rules. We interviewed 13 Reddit moderators to hear their views on different facets of transparency and to determine why a lack of transparency is a widespread phenomenon. The interviews reveal that moderators' stances on transparency are divided, that there is no standardized process for appealing content removals, and that Reddit's app and platform design often impede moderators' ability to be transparent in their moderation practices.