Monday, September 2, 2013

Reading Response #2

The theme of this week's reading is software failures. There were many aspects of the failures that we read about that surprised me. The main things that caught my attention were the apathy after the software failures occurred, the cost of the software projects, and the lack of proper testing.
The first was the reaction to the software failures by the people who played an important role in the operation of the software. When the New York Times’ article about radiation overdoses stated that Alabama officials said that there was "no such thing as overdoses", I was astonished. I always thought that when the lives of others were on the line, the people who could help would react in a more urgent manner. I thought that they would try to fix the problem as fast as they could instead of avoiding it by stating there wasn't one. Some even denied their responsibility. An example of this was when the GE spokesman stated that the scanners were "programmed by the user not the manufacturer." I feel that even if the user does make a mistake, some of the responsibility falls in the software engineer’s lap, because good software includes hazard-avoidance features that can keep human error from causing harm, as sketched below.
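Here is a rough sketch of what I mean by a hazard-avoidance feature. The machine, the dose limit, and the function names are all hypothetical, made up just to illustrate the idea that software should refuse obviously dangerous operator input instead of passing it straight to the hardware:

```python
# Minimal sketch of a hazard-avoidance check (hypothetical names and limits).
MAX_SAFE_DOSE_CGY = 300  # assumed per-session limit, in centigray, for illustration

class UnsafeDoseError(Exception):
    """Raised when an operator-entered dose exceeds the safety limit."""

def validate_prescribed_dose(dose_cgy: float) -> float:
    """Reject impossible or dangerous doses before they ever reach the hardware."""
    if dose_cgy <= 0:
        raise UnsafeDoseError(f"Dose must be positive, got {dose_cgy}")
    if dose_cgy > MAX_SAFE_DOSE_CGY:
        raise UnsafeDoseError(
            f"Dose {dose_cgy} cGy exceeds the {MAX_SAFE_DOSE_CGY} cGy limit; "
            "require a second confirmation before continuing."
        )
    return dose_cgy

# Usage: a typo like 3000 instead of 300 gets blocked instead of delivered.
try:
    validate_prescribed_dose(3000)
except UnsafeDoseError as error:
    print("Blocked:", error)
```

Even a simple guard like this shows how the software, not just the user, shares responsibility for catching mistakes.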
Another common characteristic I saw in these articles that surprised me was the amount of money spent on these software projects. The Mars Climate Orbiter cost $193 million, and the mission lasted only about 280 days because of a software failure. The FBI’s Sentinel project has a $451 million budget. This shocked me. I tend to associate large amounts of money with good quality, but reading these articles proved to me that this is not the case.
The last recurring problem that I saw in all the articles was the lack of in-depth testing before putting the finished product out to the public. This can be seen in the Therac-25 incidents, where there was a disconnect between what the operator interface showed and how the machine was actually set up. This problem could easily have been avoided if proper testing had been practiced, as in the sketch below. Another assumption I made was that software in medical devices was almost foolproof and well tested. This was another association that was proven to be incorrect.
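To show the kind of interface/setup mismatch I am describing, here is a small sketch. The mode names and the check are hypothetical; the point is that a test run before release could catch the case where the screen and the hardware disagree:

```python
# Rough sketch of checking that the display and the machine setup agree
# before anything dangerous happens (hypothetical names for illustration).
from dataclasses import dataclass

@dataclass
class ConsoleState:
    mode: str  # what the operator's screen says, e.g. "X-RAY" or "ELECTRON"

@dataclass
class MachineState:
    mode: str                    # how the hardware is actually configured
    turntable_in_position: bool  # whether the beam-shaping hardware is in place

def ready_to_fire(console: ConsoleState, machine: MachineState) -> bool:
    """Only allow the beam when the display and the hardware agree."""
    if console.mode != machine.mode:
        return False  # an edit on the console never reached the machine setup
    if not machine.turntable_in_position:
        return False  # hardware is not where the selected mode requires
    return True

# A pre-release test like this would catch the mismatch path.
assert not ready_to_fire(ConsoleState("ELECTRON"),
                         MachineState("X-RAY", turntable_in_position=True))
```

A handful of tests like this are cheap compared to the cost of shipping the mismatch to patients.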

Because software engineering has such a wide reach and is integrated into so many facets of life, failures are more likely to happen. Nevertheless, I think this fact should empower software engineers to make better software, because they already know their work will have a huge impact. I liked reading these articles because they show me what issues may occur when I am creating software in the near future, so I can avoid these problems. Even though a lot of what I read surprised me, it was interesting to see that anyone can make mistakes at any level of software engineering. This was a scary, yet reassuring, detail.
