Technical Review
This week I am doing a technical review of the hale-aloha-cli-jz software system, built by some of my classmates. The project is a command line interface that displays various energy and power usage data for a group of buildings. For those of you who read last week's blog, you'll notice that it has the same functionality as the system my group created. If you haven't been keeping up, do not fear: I will recap all of the technologies used in the project.
First, the purpose and function of the software system. This project creates an interface to the WattDepot service. WattDepot can collect and store power and energy data from any electrical meter. In this case, we are using data from the meters within the Hale Aloha residential towers at the University of Hawaii-Manoa. Data from each meter can be aggregated into virtual sources, so data can be reported for an entire tower at once even though many meters are used to collect the data. Four commands are built into hale-aloha-cli-jz: current-power, daily-energy, energy-since, and rank-towers. These four commands take a combination of source name and dates as arguments and display the requested data to the user. The system is meant to be easily extendable, so new commands can be added as the need arises.
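To give a concrete feel for that kind of extensibility, here is a minimal sketch of what a command abstraction along these lines might look like. The interface shape and method names are my own guesses for illustration, not necessarily what hale-aloha-cli-jz actually defines.

// Hypothetical sketch of a pluggable command interface; names are illustrative only.
public interface Command {
  /** The keyword the user types at the prompt, e.g. "daily-energy". */
  String getName();

  /** A one-line description displayed by the help command. */
  String getHelpText();

  /**
   * Runs the command with the user-supplied arguments
   * (typically a tower or lounge name plus one or two dates).
   */
  void execute(String[] args) throws Exception;
}

Adding a new command would then just be a matter of writing one more class that implements the interface and registering it with the command processor.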
I will be reviewing this system in regard to the Three Prime Directives of Open Source Software. The three directives are that the software 1) accomplishes a useful task, 2) can successfully be installed and used by a new user, and 3) can successfully be understood and enhanced by a new developer. Throughout the review, I will also pay attention to the use of some software engineering concepts. The first is version control. The code should be in svn, which lets each group member access the code at the same time without worrying about saving over each other's work. It also keeps a complete history of the code in case there is a need to revert to an earlier version. The second concept is Issue-Driven Project Management, in which all work on the system is planned and documented with the Issues feature of Google Project Hosting. I will check that issues were scoped small enough that none would take more than two days to complete, and that the majority of code commits were linked to an issue. Next, I will check Continuous Integration (CI) using Jenkins. The purpose of CI is to ensure that the code base is always in a consistent and working state. Each time code is committed, Jenkins should run the automated build and tests on the project. If anything went wrong, the group should have received an email about the problem and fixed it as quickly as possible. I will also check documentation (within the code, in the svn commit messages, and on the wiki), adherence to coding standards, and the quality of the JUnit testing.
So without further ado, let's get started.
1. Does the system accomplish a useful task?
I downloaded and ran hale-aloha-cli-jz. I tested each of the four commands (current-power, daily-energy, energy-since, and rank-towers) along with "help" and "quit." All of the functionality was included and everything responded appropriately. This system does seem useful.
2. Can an external user successfully install and use the program?
For this question, I looked through the documentation on the wiki to see how easy it would be for a new user to learn this system. The User Guide is complete, but I would recommend moving the "Requirements," "Download the Program," "Installing the Program," and "Running the Program" sections above "Basic Use." I do not believe Ant is required to use the program, so that requirement should be removed. The install section says "Set up your environment JAVA_HOME, path, classpath (on Windows)," which could be very confusing to a non-technical user. The User Guide might also benefit from a worked example.
The process for downloading and running the program is very easy since the .jar is included in the available .zip file. I also find the system very easy to use: the prompt is clean, user input is mostly validated, and the error messages are consistent across commands. The only gaps I could find are that the date arguments are not checked for being in the future and the date range is not checked for validity, as the two examples below show. Both are things the program could easily check itself in order to produce a better error message.
cli > energy-since Ilima 2011-12-01
Could not gather needed data.
Reason: org.wattdepot.client.BadXmlException: 400: Range extends beyond sensor data, startTime 2011-12-01T00:00:00.000-10:00, endTime 2011-11-30T16:40:17.921-10:00: Request: GET http://server.wattdepot.org:8190/wattdepot/sources/Ilima/energy/?startTime=2011-12-01T00:00:00-10:00&endTime=2011-11-30T16:40:17.921-10:00
cli > rank-towers 2011-11-27 2011-11-26
For the interval 2011-11-27 to 2011-11-26 energy consumption by tower was:
Ilima 0.0 kWh
Lehua 0.0 kWh
Lokelani 0.0 kWh
Mokihana 0.0 kWh
Also, some commands accept the current date as an argument while others do not. This should either be made consistent or produce an appropriate error message. Overall, a few small enhancements could improve the usability of the system, but it is already easy to install and use.
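As a rough illustration, a pre-flight check along these lines would catch both of the cases shown above before the request ever reaches the WattDepot server. This is only a sketch under my own assumptions about method names and wording; it is not code from the project.

import java.util.Date;

// Sketch of client-side date validation; not taken from hale-aloha-cli-jz.
public class DateArgumentChecker {

  /**
   * Returns an error message if the requested range is unusable,
   * or null if the dates look reasonable.
   */
  public static String checkRange(Date start, Date end) {
    Date now = new Date();
    if (start.after(now) || end.after(now)) {
      return "Dates cannot be in the future.";
    }
    if (end.before(start)) {
      return "The end date must not be earlier than the start date.";
    }
    return null;
  }
}

A command could run a check like this on its parsed arguments and print the returned message instead of passing the bad dates through to WattDepot.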
3. Can an external developer successfully understand and enhance the system?
The Developer Guide seems comprehensive. I like that it shows how to run each QA tool and view the reports. One thing that could be expanded on is the use of Issue-Driven Project Management. The wiki instructs developers to create issues for significant code changes, but it doesn't define "significant" and doesn't give any hint as to the preferred scope of an issue. It also doesn't ask developers to link their code commits to the related issue. The wiki should probably also remind developers not to check in the build, bin, or lib directories.
The Issues page seems to clearly document the progress of the project and decisions made along the way. It is easy to see the evolution of the system from here, and easy to tell which developer focused on which aspects of the system. Both team members seem to be equal contributors in the planning and implementation.
All of the builds passed CI (with the exception of a few days when the WattDepot server was down), which shows that the developers verified their code changes before committing them. Only the first three of a total of 21 code commits didn't have issues attached to them, which works out to roughly 86 percent and falls just short of the 90% benchmark that was set for Issue-Driven Project Management. However, the timing of those three commits at the very start of the project, and the progression later on, shows that the developers learned as the project went along.
Next I checked out the code to take a closer look. I followed the instructions in the wiki to create the JavaDoc. The first question I had was about the Commands class, which to me sounds like it would be a collection of classes implementing the Command interface. The documentation said "A utility class for use by Commands that also includes some essential configuration details." Although I understood what this meant, Commands should not be capitalized in the documentation because it is not referring to the class of that name. Renaming the class would avoid the confusion entirely. I didn't find anything else that could be improved in the JavaDocs.
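Something along these lines would remove the ambiguity; the name CommandUtils is just one possibility I made up, not a suggestion from the project itself.

/**
 * A utility class for use by commands that also provides some
 * essential configuration details. (Renamed here from "Commands"
 * to avoid confusion with the Command interface.)
 */
public class CommandUtils {
  // ...shared configuration constants and helper methods...
}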
Looking at the source code, I didn't find many issues. The code follows the standards described on the wiki. There is one thing I would change, though I'm not sure if there is a standard for this or if it is just personal preference. This system puts all of the tests for the classes implementing the Command interface into one class. I think having a separate test class for each command would make the system easier to maintain, because if a command's behavior needs to be modified in any way, all of its test methods are easy to find. If the system were one day expanded to have dozens of commands, searching through a single test file to make modifications would become extremely difficult.
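For example, a layout along these lines keeps each command's tests in one predictable place. The class and method names here are hypothetical; the real project currently groups all of the command tests into a single class.

import static org.junit.Assert.assertNotNull;
import org.junit.Test;

// Hypothetical per-command test class for the daily-energy command.
public class TestDailyEnergy {

  /** Checks that a well-formed request produces some output. */
  @Test
  public void testValidTowerAndDate() {
    // ...invoke the daily-energy command here and capture its output...
    String output = "placeholder result for this sketch";
    assertNotNull("daily-energy should produce output", output);
  }
}

A matching TestCurrentPower, TestEnergySince, and TestRankTowers would then each hold the tests for their own command.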
I also found a problem when I ran the JaCoCo coverage report: many of the test cases appear not to be executed. When the @Test annotation has an expected exception, such as @Test(expected = CommandFailedException.class), that line is reported as not run. I have never used the expected-exception form before, so I can't say what the problem is (maybe a missing space?), but it is something I would not have noticed without running JaCoCo. The developers also might want to find some tests to run on the Processor. The Developer Guide actually gives an example of a test that could be run on the input processing loop, but it isn't implemented in the code.
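For reference, the expected-exception form itself is standard JUnit 4, and a minimal self-contained example looks like the sketch below. I'm using a built-in exception type here so the example stands alone; the project's own tests expect CommandFailedException instead.

import org.junit.Test;

// Minimal illustration of JUnit 4's expected-exception form.
public class TestExpectedExceptionExample {

  /** Passes only if an IllegalArgumentException is actually thrown. */
  @Test(expected = IllegalArgumentException.class)
  public void throwsOnBadInput() {
    throw new IllegalArgumentException("simulated bad input for this sketch");
  }
}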
With the current system, the issues I mentioned wouldn't prevent a developer from being able to understand and enhance the system. So based on my review, this project does meet the Three Prime Directives of Open Source Software, even though there are some improvements that could be made in the next version.
Now that I've had a chance to look at the testing a little more, I think the problem lies with JaCoCo rather than with this code. I put a System.out.println into the test case with the expected exception. When JaCoCo runs the test cases the message prints out, but the report still says the line was never executed. My best guess is that JaCoCo only counts a line as covered when its instrumentation probe is reached, and a line that exits by throwing an exception never reaches that probe, so it shows up as uncovered even though it ran. This is kind of annoying, but I'll just assume all of the test cases are running and use JaCoCo's reports for the other classes.