Wednesday, October 22, 2008
Engineers vs. Computer Scientists
I've been thinking lately about how people in other fields, such as engineering, view programming and computing in general. This morning I think I figured it out:
Engineers write programs to solve problems. Computer scientists write programs to create problems.
Neatly summarizes the situation, don't you think?
Friday, October 17, 2008
Picture of the day
This is a screenshot of a "Robot Lawnmower Simulation" that I assigned as a project in a CS2 course I'm teaching.
Here's the link:
http://faculty.ycp.edu/~dhovemey/fall2008/cs201/assign/assign4.html
Sunday, October 5, 2008
max7219 is working
This past week I was able to get the AVR to talk to a max7219 multiplexed LED driver. One thing that took a while to figure out is that to bring the max7219 out of shutdown mode, you write a 1 to the shutdown register, not a 0. Fortunately, putting the chip in test mode (all LEDs lit) works even in shutdown mode, so I was at least able to verify that the chip worked and it was receiving data. After getting it out of shutdown mode, progress was rapid. (In the process of debugging, I also discovered that the max7219 works best when its ground pins are connected to ground :-)
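For reference, here is a minimal sketch of the kind of register write involved. The pin assignments and the max7219_write() helper are my own invention (adjust for your wiring); the register addresses (0x0C for shutdown, 0x0F for display test) are from the MAX7219 datasheet.

#include <avr/io.h>

/* Hypothetical pin assignments on port B -- adjust for your wiring. */
#define MAX_DIN   PB0
#define MAX_CLK   PB1
#define MAX_LOAD  PB2

/* Clock a 16-bit word (register address in the high byte, data in the low
   byte) into the MAX7219, MSB first, then latch it by raising LOAD. */
static void max7219_write(uint8_t reg, uint8_t data)
{
    uint16_t word = ((uint16_t) reg << 8) | data;
    PORTB &= ~_BV(MAX_LOAD);
    for (uint8_t i = 0; i < 16; i++) {
        PORTB &= ~_BV(MAX_CLK);
        if (word & 0x8000)
            PORTB |= _BV(MAX_DIN);
        else
            PORTB &= ~_BV(MAX_DIN);
        PORTB |= _BV(MAX_CLK);      /* MAX7219 samples DIN on the rising edge of CLK */
        word <<= 1;
    }
    PORTB |= _BV(MAX_LOAD);         /* rising edge on LOAD latches the word */
}

int main(void)
{
    DDRB |= _BV(MAX_DIN) | _BV(MAX_CLK) | _BV(MAX_LOAD);
    max7219_write(0x0F, 0x00);      /* display-test register: 0 = normal */
    max7219_write(0x0C, 0x01);      /* shutdown register: 1 = normal operation! */
    for (;;)
        ;
}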
One nice feature of the max7219 is that it supports undecoded output, meaning that you're not limited to decimal, hex, or code-B when using 7-segment displays. Some weird people out there have come up with 7 segment text fonts, which (surprisingly) aren't totally unreadable.
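To give a flavor of what that looks like, here is a continuation of the sketch above (it assumes the hypothetical max7219_write() helper). The digit registers (0x01-0x08), decode-mode register (0x09), and scan-limit register (0x0B) are from the datasheet; the DP-A-B-C-D-E-F-G bit-to-segment mapping is the datasheet's, so your patterns may differ if the display is wired differently.

/* Continues the sketch above; assumes max7219_write() from that example. */
static void max7219_show_hi(void)
{
    max7219_write(0x09, 0x00);   /* decode-mode register: no decoding on any digit */
    max7219_write(0x0B, 0x01);   /* scan-limit register: display digits 0 and 1 */
    /* In no-decode mode, data bits drive segments DP A B C D E F G (MSB..LSB). */
    max7219_write(0x01, 0x37);   /* digit 0: segments B,C,E,F,G -> 'H' */
    max7219_write(0x02, 0x30);   /* digit 1: segments B,C       -> 'I' */
}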
Aside from adding some buttons for user input, pretty much all of the technical details of my top-secret AVR project have been ironed out. (I still have to finish writing the code, but that should be the easy part.)
One interesting wrinkle is that the max7219 doesn't precisely support SPI, but a protocol very much like it. So, I actually have two SPI busses, one for the ds1305, and one for the max7219. (I should have ordered the max7221, which is fully SPI-compatible.) No big deal, since the atmega48/88/168 chips have a lot of I/O pins. (I'm currently using the '48, since it's cheapest.)
Thursday, September 25, 2008
SPI success!
I've had no luck getting the atmega168 to talk to a DS1307 RTC chip over i2c. My colleague Greg suggested using an RTC which uses SPI. After getting my hands on a couple DS1305s, I wired it up, and wrote some code to bit-bang SPI using several pins on port C. (I didn't use the hardware SPI support because those pins are used for in-system programming.) I tried it out, and---nothing. Every register read would return 0, which seemed a bit odd.
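For the curious, bit-banging an SPI write on the AVR boils down to toggling a clock pin and a data pin by hand. Here is a rough sketch; the port C pin choices are made up for illustration, and the clock polarity/phase have to match what the slave chip expects.

#include <avr/io.h>

/* Hypothetical pin assignments on port C -- adjust to match your wiring. */
#define SPI_CE    PC0   /* chip enable for the DS1305 (active high on that part) */
#define SPI_SCK   PC1
#define SPI_MOSI  PC2

static void spi_init(void)
{
    DDRC  |= _BV(SPI_CE) | _BV(SPI_SCK) | _BV(SPI_MOSI);   /* all outputs */
    PORTC &= ~(_BV(SPI_CE) | _BV(SPI_SCK));
}

/* Shift one byte out, MSB first, with the data valid on the rising edge of SCK. */
static void spi_write_byte(uint8_t b)
{
    for (uint8_t i = 0; i < 8; i++) {
        if (b & 0x80)
            PORTC |= _BV(SPI_MOSI);
        else
            PORTC &= ~_BV(SPI_MOSI);
        PORTC |= _BV(SPI_SCK);
        PORTC &= ~_BV(SPI_SCK);
        b <<= 1;
    }
}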
Eventually, I replaced the DS1305 with some LEDs, and wired the atmega168 pin I was using for input from the DS1305 to +5V, and noticed that even though the LEDs were blinking correctly (I slowed the SPI protocol way down), I was still getting 0 when I did the read.
Long story short, AVRs have different registers for reading and writing ports. For example, PORTC means "write to port C", while PINC means "read from port C". I, of course, was trying to read from the output register. Fixed the code, and presto, my test program worked. All it does is repeatedly write an incrementing counter to one of the DS1305 RAM addresses and read it back.
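In code, the read side of the bit-banged SPI looks something like the sketch below (continuing the hypothetical pin names from the write sketch above). The whole bug came down to the last few lines: sample PINC for input, not PORTC.

#define SPI_MISO  PC3   /* input from the DS1305 -- hypothetical pin choice */

/* Shift one byte in, MSB first, sampling MISO while SCK is high. */
static uint8_t spi_read_byte(void)
{
    uint8_t b = 0;
    DDRC &= ~_BV(SPI_MISO);             /* MISO is an input */
    for (uint8_t i = 0; i < 8; i++) {
        PORTC |= _BV(SPI_SCK);
        b <<= 1;
        if (PINC & _BV(SPI_MISO))       /* read PINC ("read port C"), not PORTC! */
            b |= 1;
        PORTC &= ~_BV(SPI_SCK);
    }
    return b;
}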
The obligatory picture:
The DS1305 is the chip just to the right of the atmega168.
Next step: use a MAX7219 to drive the 7-segment displays.
Thoughts on the bailout of the financial industry
Spending $700 billion to bail out the financial industry works out to about $2300 per US citizen.
So, the US government wants to spend $2300 of my money (probably more) to clean up the financial mess left by greedy and stupid people. We're letting the people who knowingly arranged and sold bad mortgages off the hook. We're also (presumably) letting the people who knowingly walked into mortgages they couldn't afford off the hook.
I now feel really stupid for working an honest job and living within my means. Our government seems to be largely in the business of making sure that the unscrupulous prosper.
Friday, September 12, 2008
ATMega168, i2c/twi
I changed my AVR circuit to use an atmega168 rather than the attiny2313. The primary reason is that I'm trying to use the i2c code from the Procyon AVRLib in order to talk to a DS1307 real time clock, and avr-libc doesn't seem to have the required register definitions for the attiny2313. I don't know why - the attiny2313 is supposed to support hardware TWI (really i2c).
Here's a snapshot of the new circuit, taken again with the world's worst digital camera:
My program successfully loads and runs on the atmega168. Next step: actually try to talk to the DS1307.
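Just to sketch what that next step might look like: the ATmega168's hardware TWI is driven through the TWBR/TWSR/TWCR/TWDR registers. This is a bare-bones illustration of starting a transaction (not the Procyon AVRLib code); the DS1307's 7-bit address of 0x68 is from its datasheet, and there's no error handling or timeout.

#include <avr/io.h>

#define DS1307_ADDR 0x68   /* 7-bit I2C address of the DS1307 */

static void twi_init(void)
{
    TWSR = 0;     /* prescaler = 1 */
    TWBR = 32;    /* SCL = F_CPU / (16 + 2*TWBR), about 100 kHz with an 8 MHz clock */
}

/* Send a START condition followed by SLA+W, and return the TWI status code. */
static uint8_t twi_start_write(uint8_t addr7)
{
    TWCR = _BV(TWINT) | _BV(TWSTA) | _BV(TWEN);   /* transmit START */
    while (!(TWCR & _BV(TWINT)))
        ;                                         /* wait for it to complete */
    TWDR = (uint8_t)(addr7 << 1);                 /* slave address + write bit */
    TWCR = _BV(TWINT) | _BV(TWEN);
    while (!(TWCR & _BV(TWINT)))
        ;
    return TWSR & 0xF8;                           /* 0x18 means SLA+W was ACKed */
}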
Tuesday, September 9, 2008
Powered by Ubuntu sticker
When I got my new Thinkpad X61 tablet, I (naturally) removed the "Designed for Vista" sticker and threw it away. That got me wondering whether anyone has made a "Designed for Linux" sticker. And of course, the answer is yes, and they're free:
http://system76.com/article_info.php?articles_id=9
The sticker actually says "Powered by Ubuntu", but that's good enough for me.
Double w00t!
Getting Eclipse to work on 64-bit Linux
Back at the beginning of the summer, I was experiencing strange Eclipse crashes after an upgrade to 64-bit Ubuntu. My solution at the time was to switch over to Netbeans, which worked out very nicely.
Since students here at YCP are using Eclipse for Java development, I thought I'd take another stab at getting things working.
As it turns out, there is an easy solution: use the IBM JDK. Instructions are in a comment in the following blog post:
http://dmartin.org/weblog/eclipse-on-ubuntu-linux-for-amd64
Basically, you grab the RPM for the IBM JDK, use alien to install it, then use the -vm Eclipse switch to launch it using the IBM JVM.
W00t!
Since the problem seems to be in the Sun JVM, and not Eclipse, I have to wonder why Sun has let this bug linger for such a long time.
Friday, September 5, 2008
Improved Blinkenlights
My AVR-fu is slowly improving.
I'm now using the AVR-PG1B programming cable I ordered from SparkFun, along with a DB9 cable and a breakout board that adapts the 10-pin AVR ISP header to a single-row 6 pin header, which then plugs into my breadboard with a cable and gender changer from Digilent. Very nice, works like a champ. (Note: avrdude calls this programmer "ponyser".)
I constructed a circuit interfacing the attiny2313 with an ICM7212 7-segment LED driver. So far, I only have 2 of the 4 supported digits wired, but it works nicely. Here's a photo (taken again with a very crappy camera) of the whole mess:
There's still plenty of room on the breadboard for another IC and a few switches, which are needed for my crazy top-secret project.
Tuesday, September 2, 2008
Blinkenlights success!
I couldn't wait for my AVR programming cable to arrive from Sparkfun, so I decided to build a parallel cable.
Long story short: it works. I used the pinout suggested by http://www.bsdhome.com/avrdude/, which in avrdude is the "bsd" programming device. Although that site suggested 1K resistors in series with the signals from the parallel port (to protect it from current flow from the AVR circuit), I couldn't get the device to program successfully with them in place. So, I just used wires :-)
For the cable, I soldered 5 wires of an 8-wire ribbon cable to the DB25 connector, and crimped a 16-pin socket-style connector onto the other end. I don't know exactly what those connectors are called, but they're designed to fit into an IC socket. That makes them nice for breadboards, which can't really accept dual row (IDC) headers directly.
For the program, I used the code from Elliot Williams's excellent (no, inspiring) piece on cheap AVR programming. I didn't build his development board/programming cradle (did I mention that I hate soldering?), but I used the same circuit.
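The test program itself is only a few lines. This isn't Elliot Williams's code, just a minimal blinky sketch along the same lines; PB0 is an assumed LED pin, and F_CPU has to be defined (e.g., on the compiler command line) for the delay to be accurate.

#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(PB0);            /* assumed LED pin, set as output */
    for (;;) {
        PORTB ^= _BV(PB0);       /* toggle the LED */
        _delay_ms(500);
    }
}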
Here's an amazingly crappy digital photo of the circuit and the programming cable:
Dave Babcock and Greg Link helped me put the whole thing together. It's really nice working with actual engineers :-)
Sunday, August 31, 2008
Where to buy LEDs
While on the subject of electronics, here are some good places to buy LEDs:
MPJA (www.mpja.com) - you can get 100 LEDs (red, green, or yellow) for $1.95. But, there is a $15.00 minimum order. I bought a breadboard from them recently, and the order was filled quickly, so they seem like a pretty reputable establishment.
Jameco (www.jameco.com) - item number 334052 is an orange LED that is $2.20 for 100. Other colors are more expensive, however. I've bought stuff from them in the past, and they're reputable, but their website is kind of a pain to use.
The discrepancies between what different places charge for an LED - a plain vanilla component if there ever was one - really amaze me. At Sparkfun, you pay $0.35 for a green one and $0.50 for a red or yellow. That's right, more than 10 times more expensive!
AVR microcontroller stuff
I'm pretty close to having the parts I need to start my crazy AVR microcontroller project. (Note to self: in the future, don't order parts from Thailand if you want them to arrive in a timely fashion :-) At the moment I'm just waiting for a programming cable, then I'll be off and running. I was originally going to build a programming cable, but after reflecting on my lack of soldering skills, I decided this wasn't a great idea. The one I ordered (from Sparkfun, good place to buy AVR stuff, BTW) was the AVR-PG1B, which connects to a serial port on your PC, and allegedly is supported by avrdude. I also ordered a nifty little breakout board which adapts the 10 pin AVR ISP connector (2x5 pins) to a single row of six pins which can be plugged into a breadboard. I'll need to get some help soldering it, however.
While waiting for the programmer to arrive, I've been poking around the 'net looking at various AVR tutorials. The best one I've come across is at Sparkfun:
Beginning Embedded Electronics
There's a bunch of stuff at the beginning about building a power supply circuit using a 7805 voltage regulator, but you can save that effort if you have a good switching 5V wall wart power supply. Digilent has a good one (part number SWPS). They also have a nice cable for getting the output of the power supply to a pair of leads (part number COAXPOWER). Beware though: the ends of the leads are tinned with solder, so you don't want to stick them directly into a breadboard, lest bits of solder flake off and cause shorts. A terminal block solves this problem nicely.
When I get some blinkenlights working I'll post a picture.
Friday, August 22, 2008
The rest2web template of awesomeness
Now that classes are starting again, it's time to create course web pages. As I mentioned earlier, I'm using rest2web to generate most of the course web pages.
One really nice feature of rest2web is that the template page (used as a basis for all generated web pages) can contain arbitrary chunks of python code. One thing I noticed about my site design was that the sidebar (containing navigation links) was nice for index documents, but distracting for "leaf" documents such as assignment descriptions, lecture notes, etc.
So, I put in a few lines of code to check the page being generated (available from the pagename variable: see the rest2web template documentation) to see if it's an index page, and if so, suppress the sidebar. I needed to futz a tiny bit with the CSS styles, but overall it was an extremely easy change.
My CS 200 page shows how the generated sites look. Given how much easier it is to author reStructuredText documents than HTML, I think I'm getting awfully close to web content nirvana.
Feel free to use/modify/steal my rest2web template and CSS stylesheet files.
Tuesday, August 19, 2008
Thinkpad X61 Tablet
I'm now using a Lenovo Thinkpad X61 Tablet as my main computer at school. So far, I like it a lot. I'm dual booting Windows XP and Kubuntu 8.04. I was able to get the tablet functionality working, with automatic screen rotation (nifty!), using information on the following web page:
http://www.krizka.net/2008/02/13/thinkpad-x61-tablet-automatic-screen-rotation-under-linux/
If you look down at the comments section, there's some information on running on Kubuntu (KDE) instead of Ubuntu (Gnome).
You have to have the "hdaps_ec" module loaded. I put it in my /etc/modules file.
You also have to edit /etc/X11/xorg.conf. Here's a link to the one I'm using (caveat emptor):
http://faculty.ycp.edu/~dhovemey/thinkpadX61t/xorg.conf
Finally, the "autorotate.py" script has a regular expression bug (fix is in the comments). Here's the fixed version:
http://faculty.ycp.edu/~dhovemey/thinkpadX61t/autorotate.py
The stylus is not calibrated terribly well, and the Debian/Ubuntu folks somehow forgot to include the utility that calibrates it. However, it works acceptably. I used the Xournal program to jot some notes, and while it's not exactly like writing on paper, it's not bad at all. I may try to use this in class rather than writing on the whiteboard.
Thursday, August 7, 2008
gschem vs. kicad
I have officially given up trying to use gschem. I can place components (and even create new symbols), but I have had no success connecting pins with wires. It is possible that I am an idiot, but I don't think I have to go too far out on a limb to claim that gschem isn't the most user-friendly of programs.
Fortunately, there appears to be a nice alternative called kicad. The problem of parts I need (e.g., ATtiny2313, ICM7212) not being in the default symbol library is still an issue, although some web searching did eventually turn up symbols for the parts I needed.
Update: I was able to connect VCC on the ATtiny2313 to +5V! I'm going to tentatively say that kicad rocks.
Wednesday, August 6, 2008
rest2web
I've written a lot of HTML in the past few years, especially for course web pages. I've used two basic techniques:
- Hand-written HTML using a text editor. Blech.
- Editing HTML documents using Kompozer.
Technique #2 is better than #1, but Kompozer (which was derived from the Mozilla editor) has its own share of quirks. It's buggy, and it generates ugly HTML (it wants to put <br> tags all over the #^!%#! place). I've been willing to live with its flaws, and it's saved me a lot of time. However, it doesn't help in automating the creation of navigational elements (e.g., breadcrumbs, sidebars).
Enter reStructuredText and rest2web. The former is a wiki-like lightweight markup language: a reST document looks more or less like plain text, but can easily be turned into HTML via a nifty python utility. The latter is a site-creation tool which uses reST and some additional lightweight metadata to create complete websites from reST files. It took a couple hours to learn my way around, but I was able to produce a very spiffy-looking site from (essentially) plain text files. Sweet!
Now I just need to create some content :-)
Saturday, August 2, 2008
djboxsym - a better way to create gschem symbols
OK, after my recent frustrations trying to use tragesym to make a gschem symbol for the ICM7212, I tried out djboxsym.
To make a long story short, it took about 15 minutes to do what I needed. Sweet. I used the web interface, which is very cool because you can immediately preview your work.
For posterity, my extremely modest efforts are available:
icm7212.sym
icm7212.symdef
Caveat emptor.
Friday, August 1, 2008
The sad and alarming state of circuit design tools in Linux
So, for fun I'm playing around with microcontrollers. I want to enter a schematic for the circuit I'm designing.
I tried out Eagle CAD, but there is no symbol for the microcontroller I'm using (the Atmel ATtiny2313). There are some user-contributed libraries for Atmel parts, but I don't see that part anywhere. Bleh.
[Aside: the ATtiny2313 is only an insanely popular and widely-used part, which has only been available for 3 years now. I can totally understand why it's not in any readily-available part library.]
So, I try out gschem from the gEDA package. Free software is always better, of course. The ATtiny2313 part isn't in the library, but I find it at gedasymbols.org. Yay.
Unhappily, I don't see the ICM7212 in either the built-in library, or at gedasymbols.org.
Blech. OK, how easy is it to add a new symbol? It's a plain, boring old 40 pin DIP. How hard could this be?
Well, it wasn't as easy as I had hoped. The state of the art is described in a tutorial. Here's a brief synopsis:
- Download a text file and an OpenOffice spreadsheet file. (The text file is not, as far as I can see, mentioned again in the tutorial.)
- Edit the spreadsheet to fill in hundreds of mysterious labels.
- Enter the function names, pin types, etc. for each pin.
- Save the spreadsheet as comma-separated text.
- Run the CSV through a python script, which creates a schematic file.
- Edit the schematic file using gschem. (Why? Wish I knew.)
- Save it as a symbol file.
The icing on the cake is this: the spreadsheet has pre-defined spaces for only 16 pins. I'm trying to create a 40-pin part. Being a programmer, I'm not going to sit there and fill in cells one at a time (17, 18, 19, etc...). I'll just define a spreadsheet formula that will generate these pin numbers for me, and copy them into as many cells as needed. The author of the spreadsheet, helpfully, disabled support for formulas in the entire spreadsheet. I honestly didn't even know this was possible!
Someone please just shoot me, or at least jab me with something sharp.
There seems to be a simpler approach, djboxsym, which looks like it dispenses with some of the unnecessary complexity.
Monday, July 28, 2008
Obligation analysis data
Here are the FindBugs analysis results files for my recent use of obligation analysis to look for unclosed stream bugs in Vuze and jEdit:
http://faculty.ycp.edu/~dhovemey/findbugs/vuze-obl.fba
http://faculty.ycp.edu/~dhovemey/findbugs/jedit-obl.fba
I classified the warnings as NEEDS_ANALYSIS if I considered the code apparently correct, and the warning could be avoided through the use of annotations (@WillClose, @WillCloseWhenClosed, etc.). These were cases where, if a called method (or object) failed to close the stream, then there truly would be a bug.
I classified warnings as SHOULD_FIX if I considered that it was possible for the code to fail to close a stream.
I classified the warnings as ANALYSIS_ERROR if the warning was completely erroneous.
Friday, July 25, 2008
Obligation analysis: success!
The implementation of obligation analysis in FindBugs seems to be in a useful state.
The analysis found about 8 bugs related to unclosed streams in FindBugs itself. If you write tools to find bugs, people always ask you if the tool finds bugs in itself. Well, FindBugs certainly does on a regular basis.
I analyzed Vuze (formerly Azureus), and the detector reported 35 warnings. Of those warnings, 17 appear to be legitimate issues, and another 17 are probably benign warnings that could be eliminated through the use of the JSR-305 @WillClose or @WillCloseWhenClosed annotations. (These annotations are used to specify methods and objects that assume responsibility for closing a resource.) 1 warning was essentially a duplicate of another (apparently correct) warning.
Analysis of jEdit was not quite as impressive, but still interesting: 4 apparent bugs, 9 warnings about probably-correct code that could be eliminated by annotations, and 3 cases where the analysis was wrong. (I need to investigate the last category.)
One type of false positive the paper didn't mention (that I can recall) was when one resource object "wraps" another. This, of course, is a common design pattern (Adapter) used in the java.io package.
Example:
InputStream in = new FileInputStream(filename);
Reader r = new InputStreamReader(in);
try {
    ...
    r.read()
    ...
} finally {
    r.close();
}
The analysis assumes that the InputStream is the obligation needing to be cleaned up, but the finally block closes the Reader instead.
The @WillCloseWhenClosed annotation would fix this problem (explicitly specifying the "transfer" of one obligation type to another), but since JSR-305 is not official yet, the standard Java classes don't use this annotation.
I worked around this issue by having the detector find likely places where an obligation transfer occurs, and then checking to see if the unmet obligation can be explained by an obligation transfer. This heuristic seems to work fairly well in practice.
Interestingly, a similar issue occurs when the "wrapped" resource is closed instead of the "wrapper". Technically, this could be considered a bug (the "wrapper" resource's close() method might have extra work it wants to do), but in many cases this is also a correct approach. The same heuristic (looking for probable obligation transfers) seems to be effective.
Thursday, July 17, 2008
Don't use the Intel CPU fan
I have built two systems using Intel socket 775 CPUs recently. Last summer, I built one using a Pentium Dual Core E2140 (1.6 GHz), which I use at school. This summer, I built one using a Pentium Dual Core E2220 (2.4 GHz), which I use at home.
Putting together the newer one, I managed to break off one of the "legs" of the stock Intel CPU fan. So, I bought a cheap third-party CPU fan at a local computer store. The older system I use at school has the stock Intel CPU fan installed.
Interestingly, my home machine runs much cooler than the system at school. Even under load, neither core gets above 45C, and each core idles around 30C.
The system at school idles around 45C, and reaches close to 60C under load, even though the CPU is running nearly 1 GHz more slowly than my home machine. Each machine has a cheapo ATX case with a case fan (in addition to the power supply fan.)
This strikes me as odd: if you follow Intel's installation instructions exactly, you get pretty inadequate cooling. I know the customer reviews on newegg always say that the Intel CPU fans suck. Well, I guess they do!
Monday, July 14, 2008
Obligation analysis
A common form of runtime error in Java programs is not closing or freeing an acquired resource on all paths out of a method. This kind of error is especially common with i/o streams, but also affects database resources, JSR-166 lock objects, etc.
FindBugs has a couple detectors that I wrote quite a while ago for detecting such errors. The detectors use a rather ad-hoc analysis, and produce a variety of annoying false positives.
Wes Weimer and George Necula proposed a nice static analysis to find such errors at OOPSLA 2004. I am finally getting around to getting this analysis implemented in FindBugs. Their analysis tracks obligations (open streams, db connections, etc.) on (effectively) all acyclic paths through methods, the basic idea being that every acyclic path ought to discharge all of its obligations. The analysis does not attempt to track the actual resource values through variables and heap locations. Instead, it just checks that each resource acquisition reaches an appropriate resource de-allocation.
I think I have finally gotten to the point where I understand how the analysis works, and the initial implementation in FindBugs seems to be working. I still need to complete the database of method calls which create or discharge obligations, and also implement several post-processing steps for false positive elimination, but I don't think this will be a huge amount of work.
Friday, July 11, 2008
Ruby on Rails in Netbeans
Netbeans is slowly becoming my favorite IDE. (Sorry, Eclipse!)
Today I started using the Ruby on Rails support within Netbeans, and it's quite nice. I'm probably not doing anything terribly sophisticated, but I did manage to create and run migrations, create some controllers and views, and launch the app, all from within Netbeans. Eclipse probably has support for all this stuff, but due to unexplained Eclipse crashes (on Linux), I can't actually use it.
I'm using the JRuby plugin for Netbeans, which means my Rails code is actually running in Java. Kinda nice - Java is a more ubiquitous runtime environment than Ruby, so I'm thinking this will be helpful when it comes time to deploy.
Monday, June 9, 2008
Eclipse weirdness, NetBeans to the rescue
Last week, I did a hardware upgrade on my home PC. Originally, I had an EliteGroup 848P-A motherboard with a Pentium 4 2.8GHz (Prescott) with 2 GB of DDR 400 RAM. The new configuration has a Gigabyte GA-P35-S3G motherboard with a Pentium Dual Core E2220 (2.4GHz) and 4 GB of DDR2 800 RAM. It was a pretty cheap upgrade, and performance on compute-intensive tasks seems to be about 2x faster. Kubuntu 8.04 recognized all of the new hardware; no reconfiguration was necessary.
Weirdly, there is one important application that no longer works following the upgrade: Eclipse. I get repeated segfaults in libjvm.so. As far as I can tell, it's not a hardware problem. All other applications I have tried have been 100% stable, my CPU temperature has not exceeded 41 C for either core, memtest86+ did not find any problems with the memory, etc.
So, I conclude that it's some sort of software problem. Could it be a weird interaction between SWT and gtk+? This is where native code really sucks.
In the meantime, I'm using Netbeans for Java development. It's gotten quite a bit better since the last time I used it. It's maybe not quite as polished as Eclipse, but the important features (code completion, cross-referencing, and refactoring) are there.
Thursday, May 29, 2008
Summer!
Hooray, it's summer! (I define summer as the period of time between Spring and Fall semesters, not by the progress of the earth around the sun.)
I'm setting up my home machine to do some work on FindBugs over the summer, and since my Ubuntu 6.10 was getting a bit stale, I decided to upgrade. I happened to have a CD burned with Kubuntu 8.04, so I backed up my essential files and let the installer rip. So far, it seems nice. Using KDE rather than GNOME took a bit of getting used to. Overall, KDE seems less polished than GNOME, but more configurable. I'm using Amarok to play my music files, and it appears to be significantly better than Rhythmbox. (See the previous post for my ranting about how much I dislike Rhythmbox.)
Out of curiosity, I installed the Ubuntu openjdk package, and tried running Eclipse on top of it. So far, it seems to work quite well! It's exciting to finally have a usable free Java implementation. Major kudos to Sun for open sourcing the JDK.
There is a weird bug on Ubuntu/Kubuntu 8.04 with Eclipse: here is the bug report. The workaround described in the bug report does seem to fix the problem.
I bought a new monitor, an Acer AL1916. Newegg was having a special for $159, with free shipping. Now (at long last) both of my monitors are the same size and resolution.
I finally started some work on FindBugs today. First project: implementing exclusive type qualifiers. (This is part of implementing support for JSR 305 type qualifiers in FindBugs.) Bill Pugh has a nice presentation about JSR 305 which explains all of the goodness.
Tuesday, May 6, 2008
Rhythmbox
I've been using Rhythmbox for a while to play my music files (which are, of course, in Ogg Vorbis format.)
I hate to say it, but I have become so frustrated with Rhythmbox that I'm now actively looking for a replacement. Here are my main gripes:
Gripe #1: When you toggle between the "small display" and the full size display, the window size chosen is always wrong. What I expect to happen is that whatever window size I configure in the two modes, Rhythmbox will remember my decision. For f***'s sake, would this be so hard to implement? Here's what actually happens: when switching from the small display to full display, the full display gets a hard-coded height of about a third of my display height. Here's a screenshot:
As you can see, the various lists (artist, album, tracks) are completely squashed. THIS SUCKS!!!! (As a bonus bug, you'll notice that in Ubuntu 7.10, gimp is no longer able to capture screen shots that include the window decorations.)
When switching from the full display back to the small display, sometimes the size is restored correctly, and sometimes the width of the full display is preserved (meaning that you get an extremely wide small display):
Nice work, rhythmbox!
Gripe #2: When an album finishes playing and you click "Play" again, it starts playing from the last track, not the first. Yeah, that's just what I wanted to do.
Gripe #3: If you click "Previous" too quickly, playing stops altogether, even if you haven't reached the first track yet.
One of the reasons I have been an enthusiastic user of free software over the past 15 years or so is that it generally places a high value on correctness and utility over bells and whistles. It concerns me greatly that the free software world is moving towards a Windows model where every application is skinnable, animated out the wazoo, has a feature list the size of a telephone book, and is impossible to use for more than 2 minutes without uncovering a serious bug.
Wednesday, April 2, 2008
Greatest Rock 'n Roll Song Ever?
I've had a theory for a while that the best Rock song ever written is "Memphis, Egypt" by the Mekons. Trying to describe it in words is pointless, so go listen to it if you haven't heard it. It's on the album The Mekons Rock 'n Roll. I've had the privilege of hearing them play it live, and all I can say is, holy ****.
There are a few songs that I think are almost as good as "Memphis, Egypt". "Club Mekon", also by the Mekons, comes close, and actually follows immediately after "Memphis, Egypt" on the same album! This has to be the greatest two-song sequence ever recorded.
"The Headmaster Ritual" by The Smiths (on Meat is Murder) is possibly a better song than "Memphis, Egypt", but loses some points for the mechanical production that saps much of the energy from the track. (Aside: the best Smiths album is Hatful of Hollow because it is (mostly) a collection of tracks from radio shows, and captures the manic energy of the band much better than any of their studio output. But you knew that.)
I recently discovered on Pandora a song called "I Stare Out..." by The Verlaines which I think could be as good as all of the previously mentioned songs; at the moment, I consider it a major discovery. (Had you ever heard of The Verlaines? I hadn't.)
Thursday, March 13, 2008
Today's riddle: why does CUP emit the generated parser as TWO classes?
OK, so when you use CUP to generate a parser, it emits two classes: a parser class, and an action class (which contains the code generated for the semantic actions associated with the productions of your grammar). These are totally separate classes.
Today's riddle is this:
Why is the code for the semantic actions generated in a separate class?
Here are some possible answers:
- Your guess is as good as mine.
- Look in the user manual to find out --- oh wait, the user manual doesn't explain this.
- To ensure that you must violate encapsulation in order to allow semantic actions to refer to internal parser operations?
The action code can refer to the parser via a field called parser, but that's only useful for calling public methods on the parser object. But that means that any internal parser state/operations that the semantic actions want to access must be exposed as public, violating encapsulation.
Blech.
Wednesday, March 12, 2008
GUI Builders for Eclipse
The time has come to talk about GUIs in the Software Engineering course I'm teaching currently, which led me to revisit using a GUI builder in Eclipse. (I will not hand-code a Swing GUI. I just won't.)
Previously I've used the Eclipse Visual Editor plugin, which, while not perfect, generally gets the job done. To my dismay, the current release of the VE does not work with the current stable release of Eclipse, and there has not been an official release of the VE since June 2006. (So sadly neglected / and often ignored / a far second to Belgium / when going abroad / Finland, Finland, Finland :-) I hope this project gets reinvigorated at some point, but I wasn't going to sit on my hands waiting for that to happen.
A quick google search turned up Jigloo, an Eclipse-based GUI builder which, while not free software, is gratis for non-commercial use. From my 10 minutes or so of using it, it appears to be very nice, quite a bit more polished than VE. We'll see how it goes, but I'm cautiously optimistic it will do what I need it to do.
Tuesday, February 26, 2008
ArgoUML
In teaching a course on Software Engineering and Design this semester, I have rediscovered ArgoUML. Briefly, it's a UML design tool along the lines of Rational Rose or Visual Paradigm. I can briefly summarize its strengths:
- It works pretty well, especially for basic uses (creating UML class diagrams)
- It's free software (in the sense of both freedom and money)
- It's a pure Java application, and works well on any platform supported by Java
ArgoUML's main drawback is the lack of an Undo feature, which is certainly a bit disconcerting. Web search hits on the ArgoUML development lists seem to indicate that this feature is in the works; if it gets done, then my enthusiasm level would go from "pretty cool" to "KICK ASS". Even without Undo, it's still a good choice for occasional UML modeling needs.
Tuesday, February 5, 2008
Java Software is a Good Thing
I'm teaching a course on Compiler Design, and I'm going to have students use JFlex and CUP as the scanner and parser generators.
I always dread asking students to use any software besides Visual Studio or Java/Eclipse, since it means I have to worry about whether or not
- they have it installed
- they have it installed correctly
- they have the right version
- etc.
It occurred to me today that JFlex and CUP are both written in Java, so I could simply include them in the assignment skeleton! This took me all of about 5 minutes. Now I have a compiler assignment skeleton that requires only Java and Eclipse. In fact, it has an Ant script, so you don't even have to use Eclipse. So far I've only verified that it works on Linux, but I'm pretty confident that it will work on Windows, too.
JFlex and CUP are both free software, so there are no license issues to worry about.
Monday, February 4, 2008
The Vague Syntax of Ruby and Ruby on Rails
I like the Ruby programming language a lot, and the Ruby on Rails web application framework is one of the best ones out there. One characteristic they share is an emphasis on writing concise code. Ruby pares down the syntax of writing object-oriented programs to a bare minimum. Rails emphasizes the use of a small number of conventions and idioms in order to avoid specification of all but the most essential details.
However, I think that both Ruby and Ruby on Rails take the principle of brevity to an unreasonable extreme. Here are a few examples.
First, Ruby (the language) does not require parentheses around conditions or method arguments. So, you can write
foo.bar baz, thud
instead of
foo.bar( baz, thud )
In the second form, isn't it much more obvious that we're calling a method, and that baz and thud are the arguments?
As an even simpler (and more ambiguous) example, say that you see this code in a Ruby method:
blat
A bare identifier does not really provide any clue that would suggest to the reader how the identifier is being used. In this case, it will be interpreted as a method call with no arguments. Wouldn't it be much more clear like this?
blat()
I think the general lack of visual cues in Ruby code makes it difficult to read.
Rails code (at least in the books and on-line tutorials I have read) tends to opt for the same kind of extreme brevity. For example, consider the following code:
redirect_to :action => :login, \
    :destination => request.request_uri \
    and return false
I found this code in an implementation of user authentication using something called Confluence4r. The code specifies what should happen when a privileged action is attempted without the proper credentials being present in the user's session. It's reasonably clear that a request is being redirected. However, an options hash is being used to specify the details of the redirection.
I guess that options hashes are good in the sense that unnecessary information can be omitted. However, I think options hashes are overused in Rails. An options hash is basically a "magic bag of goodies" that a method will use to carry out some behavior. However, the specification of the options hash at the call site does very little to inform the reader how the contents of the hash will influence the behavior of the called method. In the case above, it's reasonably clear that :action => :login will redirect to the login action. However, what is going on with the :destination key? As far as I can tell, it simply puts request.request_uri in the query parameters of the redirected request, but I fail to see how that behavior is even hinted at in the text of the method call. Wouldn't something like the following be much clearer?
next_request = Request.new()
next_request.set_action( :login )
next_request.add_param( :destination, request.request_uri() )
redirect_to( next_request )
return false
Sure, we replaced 1 line of code with 5, but the reader would have a much better chance of figuring out what is going on.
Sacrificing a bit of brevity in order to get self-documenting code seems like a good tradeoff to me.
Wednesday, January 9, 2008
Back to work!
I'm getting back to work after a very enjoyable holiday break. I was able to play a significant amount of Super Mario Galaxy while I was on vacation; it's a very good game, and (IMO) better than Super Mario Sunshine, but still not as good as Super Mario 64.
I'm teaching a Compilers course in the Spring, so I'm beginning to get things ready. Today I played around with JFlex and CUP, which are Java equivalents to the ubiquitous lex and yacc. As much as I enjoy the occasional bout of C hacking, Java is a much better teaching language. I put together a simple JFlex/CUP example that demonstrates integrating a JFlex lexer and a CUP parser. JFlex and CUP are designed to work together, so it wasn't a huge task.
Next task is to investigate using Jasmin to compile generated JVM bytecode.
My colleague Dave Babcock and I are working on a paper to submit to ITiCSE 2008. It will describe a nifty sequence of CS1 programming labs and assignments. Getting stuff published in CSEd conferences is always a crap shoot, so we'll see what happens.
WXPN radio started broadcasting in the York/Lancaster area in the Fall, and I've been enjoying it a great deal. One important musical discovery I made via XPN is Neko Case: her most recent album, Fox Confessor Brings The Flood, is one of the most brilliant things I've heard in a long time.
A major event is going to take place in February...more later!