tl;dr: Removing dead code by logging timestamps. GitHub link.

Any big code base accumulates a lot of dead code. It’s not that the code base isn’t well tested or well documented; it still ends up with code that is never used anymore. Most of the time, dead code survives because the developer isn’t sure what a function is for, or what will happen if she removes it. Irrespective of how modular your codebase is, iterative development results in functions being modified or ditched for better ones. I do that a lot, and I face a lot of problems while cleaning up my code. It almost makes me feel like Dee Dee in Dexter’s Laboratory, saying, “Oh! What happens if I remove this function?”

I was glad to learn that I wasn’t the only one facing this. In one of the O’Reilly talks (I’m sorry, I’m not able to find the link), a Box developer spoke about how they dealt with a similar situation. They attached a timestamp to each function and logged every single time the function was called. This generated a log of every function that had been accessed, its last date of access and so forth. It gave a clear picture of which functions were used most, which were used infrequently and which ones weren’t being used at all. Dead code found. This was such a simple, robust and clean approach that I decided to implement the same in our codebase. I also improved it by a tiny bit and calculated the execution time of each function. Because, why not?

So, introducing tombstone.py, which does exactly that, though it is very much in its foetus state and will take a bit of time to improve and make production ready.

import tombstone
import datetime

class HelloWorld:
    def __init__(self):
        pass

    # time isn't a necessary argument. If not provided, it takes the current time.
    # module name needs to be there
    @tombstone.logs(module_name="HelloWorld", time=datetime.datetime.now())
    def test_it(self):
        print "dance basanti"

hel = HelloWorld()
hel.test_it()

That’s it. It stores access logs every time the function gets called, along with an average execution time of the function. The data is currently stored in Redis; I would like to add support for other databases and even text files. To retrieve the data, it has a very simple API.

from tombstone import Tomb

# Gets JSON data which contains module name, function name,
# average execution time, usage count and last usage datetime string
print Tomb.get_data()

# Clear data for a single function
Tomb.remove_data("module_name:function_name")

# Clear all data
Tomb.remove_all_data()
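To make the dead-code hunt concrete, here is a sketch of how the returned data could be filtered. The field names ("usage_count", "last_used") and the overall dict shape are my assumption for illustration, not tombstone’s documented schema:

```python
import datetime

# Hypothetical sample of what the retrieved stats might look like --
# the keys and field names here are assumed, not tombstone's real output.
data = {
    "HelloWorld:test_it": {
        "usage_count": 42,
        "last_used": datetime.datetime.now().isoformat(),
    },
    "HelloWorld:old_helper": {
        "usage_count": 0,
        "last_used": None,
    },
}

# Flag anything never called, or not called in the last 90 days.
cutoff = datetime.datetime.now() - datetime.timedelta(days=90)

dead = [
    name for name, stats in data.items()
    if stats["usage_count"] == 0
    or (stats["last_used"] is not None
        and datetime.datetime.fromisoformat(stats["last_used"]) < cutoff)
]

print(dead)  # → ['HelloWorld:old_helper']
```

The same loop works whether the stats come from Redis, a text file or a JSON dump.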

That’s about it. I thought of building a dashboard for better visibility, but sometimes the overhead of something like Backgrid or Dynatable seems a bit too much for an app like this. That said, I would consider building a dashboard if the need arises. As of now, I believe JSON data is enough: you can plug it into your modules and display it any way you want.