
This is the challenge of modern software development. When I first started my career, I spent two months looking for an intermittent crash in a video game caused by a misplaced free(); this was in code that was 100% ours. Now, there likely isn't an application in existence that does anything interesting where 100% of the code was written for the app itself. As such, you are no longer debugging just your own code, but other people's as well.

Vernor Vinge coined the idea of the software archaeologist: people who could sift through layers of code, all the way back to the beginning of time (Jan 1, 1970 and Unix, of course), to understand how systems work and to make modifications to them. We aren't that far from that point; already, there are people who seem to specialize in digital spelunking to find and fix bugs in these underlying layers.

While I love building new code, some of the most satisfying moments in my career have been when I've gone back through somebody else's code, untied it (oh, you built a polymorphic type system including class initializers and destructors in a decidedly not object oriented language? Cool.) and fixed it. There's something interesting about getting into somebody else's head and seeing how they approached the problem, then finding out where they were wrong.



Interestingly, in my first job, all of the code I worked on in my first 3 years was "in-house" code (because everything from the bootloader up to my code was proprietary). I think places like Microsoft also have groups with the same phenomenon. However, the people who initially wrote or worked on that code were long gone.

I've gone on to work other places where I had to do more of this archaeology, and I have to say it actually felt similar.

In summary, I think there is a new, combinatorially more complex amount of archaeology occurring now. The Microsoft, Apple, NetApp, Linux, EMC, VxWorks, etc. OS people have been dealing with some of this for a while, within one OS; people who rely on services, on many processes, or on an Internet of Things face it across every layer at once.

It feels like we'll never have a POSIX of the internet. HTTP is as close as we've gotten, and it's too narrow an interface to serve as a real read/write/exec layer. We'll never have anything you can fully "trust," so more and more developers need the patience to wade through everyone else's code as well.


I love the idea of software archaeology in the context of video game software. Video game platforms (the earlier the better) usually cause developers to resort to wacky/unique tricks to make the most of their limited resources. And I feel like that ends up giving video games most of their unique flavor.


I chatted with Wired about just this concept: http://www.wired.com/2014/08/facebook_bug/


No offense to you but that article is very poorly written. It seems like someone abducted the author mid-article and pressed the publish button :)


There are some technical inaccuracies that are hard to look past as a coder (mixing up POSIX and UNIX, describing unexpected behavior as a bug). However, I think the author did an accurate job on the high-level tenor of the article and touched on a critical meta-point about this bug.



