Thursday, October 31, 2013

Blog Reflections #11: Deliverable #3 Feedback

   I feel like our presentation for deliverable #3 went well. Our script ran as planned. We have to make a few changes, most of which I feel we already knew about coming into the presentation. We need to:

  • Not hard-code our methods into their designated drivers
  • Add method and ID columns to our HTML reports

   These are easy fixes, and we will make them this weekend. Once they are done, we still need to find one or two more methods to test. If we finish this, we will be way ahead of our schedule and can start working on our poster. 
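
   Roughly, the change looks something like this (the file names, fields, and commands here are only an illustration I am using, not our actual framework): the script reads the method name and testcase ID out of the testcase file instead of hard-coding them, and then writes both into the HTML report row.

      #!/bin/bash
      # Illustrative driver wrapper: read the method name and testcase id from
      # the testcase file instead of hard-coding them, then record both as
      # extra columns in the HTML report row.
      tc_file="$1"                                                    # e.g. testcase1.txt
      method=$(grep "^method:" "$tc_file" | cut -d: -f2 | tr -d ' ')  # e.g. add
      id=$(grep "^testcase:" "$tc_file" | cut -d: -f2 | tr -d ' ')    # e.g. 1
      inputs=$(grep "^inputs:" "$tc_file" | cut -d: -f2)

      ./"$method" $inputs > actual.txt        # run the driver that matches the method
      if diff -q actual.txt "oracle$id.txt" > /dev/null; then
          result="pass"
      else
          result="fail"
      fi

      # one row per test, now with id and method columns
      echo "<tr><td>$id</td><td>$method</td><td>$result</td></tr>" >> report.html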

Sunday, October 27, 2013

Blog Reflections #10: Successes and Searches...

   My team and I now have our whole framework working with our simple methods. We are still in the process of finding methods that actually relate to the Firefox desktop software. This is the only thing keeping us behind schedule.
   In addition to searching for workable methods, we have started re-working our test plan based on the feedback we received during our last presentation. We will also add our experience report for deliverable #3 to this document. I have many things to report on for this deliverable. It was the most challenging to date.

Thursday, October 24, 2013

Blog Reflections #9: Slowly Putting it all together

    We now have a script that iterates through all our testcase files, one that compares two files (and outputs whether they are the same or different in an HTML document), and one that extracts the input portion of a testcase file so we can use it as the parameters of the method we will test. We are now in the process of putting all of these scripts together into one coherent script that will become our runAllTests.
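
   Put together, the idea is something like the sketch below (the file names and the testcase format are placeholders I am using for illustration, not our real files):

      #!/bin/bash
      # Rough runAllTests sketch: loop over the testcase files, pull out the
      # inputs, run the driver, and compare each result against its oracle file.
      report="results.html"
      echo "<html><body><table border=1>" > "$report"
      echo "<tr><th>Testcase</th><th>Result</th></tr>" >> "$report"

      for tc in testcase*.txt; do
          # assume each testcase file has a line like: "inputs: 2 3"
          inputs=$(grep "^inputs:" "$tc" | cut -d: -f2)

          # run the driver on those inputs and capture its output
          ./driver $inputs > actual.txt

          # compare against the matching oracle file (testcase1.txt -> oracle1.txt)
          oracle="${tc/testcase/oracle}"
          if diff -q actual.txt "$oracle" > /dev/null; then
              result="same"
          else
              result="different"
          fi
          echo "<tr><td>$tc</td><td>$result</td></tr>" >> "$report"
      done

      echo "</table></body></html>" >> "$report"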

   After we get done with this, we will try our script on the actual method we want to use from the Mozilla Firefox code. I hope all goes well and our Toy code is flexible enough for this big change.

Tuesday, October 22, 2013

Blog Reflection #8: Toy project

   For a while I have been trying to isolate a floor mod method I found within the Firefox code so that we could test it in our project. Instead of spending more time on this, I decided to make up a simple add method, get it to compile, and work with that. I planned on getting it to meet all the requirements of deliverable 3.

  I worked with Hannah on most of this. We were able to figure out how to print the result of the add method to a text file. We then worked on a script to compare this text file to another text file (a toy oracle) that just had an example outcome in it. We then were able to get the results of this comparison to print in an HTML document in a browser. This process was very rewarding!
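
   The toy workflow was basically along these lines (the exact commands and file names here are an illustration, not a copy of our script):

      # Toy workflow sketch: compile the simple add program, save its output,
      # and compare that output against a hand-written "toy oracle" file.
      g++ add.cpp -o add              # add.cpp wraps our made-up add method in a main()
      ./add 2 3 > result.txt          # the program prints the sum to a text file

      if diff -q result.txt oracle.txt > /dev/null; then
          echo "<p>Files are the same: test passed</p>" > compare.html
      else
          echo "<p>Files are different: test failed</p>" > compare.html
      fi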



   We now are trying to see how we can add all of the other required information to our test case file (such as the testcase #, requirement tested, inputs, and outcome), and have our driver read this file and grab our inputs from it. While we are working on this, Matt will keep working on isolating the "real" method we need to work with.
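
   For example, a test case file might look something like the lines in the comment below (the field names are just a guess at a format, not something we have settled on), with the script pulling out only the inputs line for the driver:

      # Hypothetical testcase file (testcase1.txt):
      #   testcase: 1
      #   requirement: REQ-ADD-01
      #   inputs: 2 3
      #   outcome: 5
      # Grab just the inputs line and hand the values to the method under test:
      inputs=$(grep "^inputs:" testcase1.txt | cut -d: -f2)
      ./add $inputs > result.txt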

   I am particularly proud of the work of Hannah, Matt, and myself. Unlike the groups who have already finished deliverable 3, we did not have anybody in our group with prior experience with Linux scripting OR C++. At first, I doubted whether we could overcome this learning curve in time to turn in this deliverable. NOW I think we can if we stay at it. I am learning A LOT while working on this project! :)


Thursday, October 17, 2013

Blog Reflections #7: Scripts, scripts, and more scripts!

   In preparation for deliverable #3, I have been going over the scripting tutorials on this site again. The site covers many scripting functions. I went through example scripts that:

  • Displayed the date
  • Displayed the current user
  • Stated the operating system
  • Used the read statement to get input from a user
  • Stated the current working directory
  • Searched for a specific file within a directory
  • Demonstrated how to write functions in a script 

Here is a screen grab of one of the scripts:
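
   In the same spirit as those tutorial exercises, a small practice script touching several of the items above might look like this (this is an illustration of the style, not the actual tutorial script):

      #!/bin/bash
      # Practice script: date, current user, OS, reading input, working
      # directory, searching for a file, and a small function.
      echo "Today is: $(date)"
      echo "Current user: $(whoami)"
      echo "Operating system: $(uname -s)"
      echo "Current working directory: $(pwd)"

      read -p "Enter a file name to search for: " name
      find . -name "$name"

      # a simple function
      greet() {
          echo "Hello, $1!"
      }
      greet "$(whoami)"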

    I started to write deliverable 3, but ran into some trouble. I hope these tutorials will help me write the testing framework for our project. I am learning a lot, but I still need to actually complete the project.

Thursday, October 10, 2013

Blog Reflections #6 - Digging Deeper (Team Update)

    In the last class, we went over our test plans for our projects. During our presentation, we noticed that we may not have the time to do all the tests we previously wanted to do (for example, testing how Firefox handles the stress of operating with many tabs open). We are now in the process of narrowing the scope of our test plan. 

    We are going to dig deeper into Firefox's source code to find just a few methods. We are then going to come up with different inputs and expected outputs for these specific methods. I think this actually makes things easier for us. I liked the feedback we received. I would rather be told we were being too ambitious than that we did not have enough to work with. This was encouraging. I hope we are able to understand and effectively test the methods we choose. I also hope we are able to get a lot of work done during the upcoming break. 
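
    To make the inputs-and-expected-outputs idea concrete (these numbers are only an illustration, not actual test cases), for something as simple as an add method the pairs might be: inputs (2, 3) with expected output 5, inputs (-4, 7) with expected output 3, and inputs (0, 0) with expected output 0.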

Tuesday, October 8, 2013

Blog Reflection #6 - Group...Team/IEEE Test Plan

    My group has progressed. When we first started working together, we sat separately at our computers and searched for anything we could find about our project. Now, we are more comfortable with our knowledge of Mozilla Firefox. We are working better together, and we finished our test plan very efficiently. One of the main things I like about working with my particular group is the distribution of work. We ALL helped work on our test plan, and we are trusting each other's ideas more. I feel we are more than a group now; we are becoming a team. 

   For our test plan we followed the IEEE standard test plan. Although this was a more difficult approach, we felt we would need to learn it for when we help make a test plan in the future. We would have to follow IEEE standards then, so it is good to get a head start on following these protocols early. 

    Here is the IEEE test plan example we used. I felt it was a very clear example to follow: 


    

Thursday, October 3, 2013

Blog Reflections #5: Testing, Testing...

On Tuesday, we had our first test on everything we have learned so far in Software Engineering. I have learned many things from this course. Some of the key things I learned include:

The difference between software engineering and computer science:

  • Computer Science focuses on theory and fundamentals, while software engineering is the actual process of developing software.
Sociotechnical Systems:
  • Sociotechnical systems include non-technical elements (such as people, processes, and regulations) as well as technical components such as computers, software, and other equipment. 
Wicked Problems
  • A problem that is so complex and involves so many related entities that there is no definitive problem specification. 
Fan-out
  • When a method calls many other methods (see the small sketch after this list). 
Re-factoring
  • Continuous improvement; it combats the degradation that builds up as changes are made. 
Ethnography
  • An observational technique where the software engineer puts himself or herself in the shoes of the user. 
Code & Fix
  • I learned this is the worst approach to take when developing software. 
None of these concepts were included on the test. They just stood out to me, and I feel like I should know them if I want a career in software engineering. 
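
As a tiny illustration of the fan-out idea in scripting terms (all of the function names here are made up), the run_all function below has a fan-out of four because it calls four other functions:

      # illustrative only: run_all "fans out" to four other functions
      setup()        { echo "setting up"; }
      run_tests()    { echo "running the tests"; }
      write_report() { echo "writing the report"; }
      cleanup()      { echo "cleaning up"; }

      run_all() {
          setup
          run_tests
          write_report
          cleanup
      }

      run_all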

~Team Update~
   My team and I are finishing up the tests we have been running. Firefox has 16 different types of tests, so we split them up among the five of us. The testing for Firefox is very thorough, so I think it is a good idea for us to see every kind of test in order to fully understand the system. We are starting deliverable #2 today.