Monday, September 27, 2010

Persistence in Java

This morning, amid the thousands of other things I'm trying to get done, I decided to formalize my understanding of and opinions on the Java Persistence API (JPA). This is the subject of a new article, Companion Notes on the Java Persistence API and using EclipseLink with Apache Derby, at Java Hot Chocolate. I piggy-backed my comments atop an existing tutorial, one of the many fine ones published by Lars Vogel.

I discuss a few subjects that go beyond the scope of the original tutorial. I note that, while JPA is a pretty light framework, it's heavier than simply using XStream and the file system, which may be the easier route depending on how complex your persistence needs are.

I also discuss the dismal absence of a solution for letting POJOs evolve with respect to this framework, and point out that the @Version annotation, whose existence might at first make you think there is such a solution, is really an unrelated optimistic-locking mechanism in JPA.

What's the solution to issuing subsequent updates of your application in which you've made schema changes? There isn't one, although I can think of ways to implement one depending on whether you're willing to go outside the framework and test subliminal version references you sprinkle into the POJOs yourself.
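A minimal sketch of one such out-of-framework approach, assuming you maintain the version marker and the migration chain yourself; all class, field and constant names here are invented for illustration:

```java
// Hypothetical sketch: a schema-version marker maintained by hand, outside
// JPA, so the application can notice records written by an older release.
public class SchemaVersioning
{
    // The schema version this release of the application writes.
    public static final int CURRENT_SCHEMA = 3;

    // True if a stored record predates the current schema.
    public static boolean needsMigration( int storedVersion )
    {
        if( storedVersion > CURRENT_SCHEMA )
            throw new IllegalStateException( "record newer than this application" );
        return storedVersion < CURRENT_SCHEMA;
    }

    // A trivial migration chain: each pass upgrades one version; the
    // per-version data fix-ups would go where the comment sits.
    public static int migrate( int storedVersion )
    {
        int v = storedVersion;
        while( v < CURRENT_SCHEMA )
        {
            // ...apply the fix-ups that take version v to v + 1...
            v++;
        }
        return v;
    }
}
```

The point of the sketch is only that the version test and the upgrade chain live in your code, not the framework's.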

It's easy to observe the SQL statements used by JPA via a setting in the metadata file, so little experimentation is necessary to figure out how to work around it. Could you attach triggers to sort some of this out if you wanted to be totally clever (and perhaps a bit obfuscative), to avoid adding too much to your Java corpus?
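For reference, here's a sketch of the sort of setting I mean; in EclipseLink, the eclipselink.logging.level property at FINE makes the generated SQL show up in the log (the persistence-unit name here is invented):

```xml
<persistence-unit name="example" transaction-type="RESOURCE_LOCAL">
  <properties>
    <!-- FINE makes EclipseLink log the SQL statements it issues -->
    <property name="eclipselink.logging.level" value="FINE"/>
  </properties>
</persistence-unit>
```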

Each schema class is its own entity and, in the end, each modified class is basically a totally new entity. That's what it comes down to. I've worked on products that couldn't move forward because their schema was such an anchor around their neck, so I'm interested in having a ready solution next time.

Friday, September 24, 2010

Ah - ha: I've been a victim of bad advice!

I am freshly disabused of a widespread piece of bad advice that I now dare expose.

When I left Novell and entered the job market as a candidate years ago, I learned a number of skills for job-seeking. The one I'm thinking about today is the resume. (If you've been following my blog at all, you know that I've been looking casually, then more intensively for another position since earlier this summer.)

Like everyone else in the software industry, the more years I worked and the more places I worked at, the longer my resume grew. Even back then it had disturbingly reached about four pages. (Almost sounds like a line out of the mouth of Jacob Marley, doesn't it?)

A veteran of twice attending the LDS Employment workshop, I learned that your resume must not exceed a page or two and, anyway, no one will ever read even to the bottom of the first page before making a decision on whether to consign it to the "round file" or keep it in the list of people to consider.

This makes a great deal of sense to me. Human Resources people process incredible numbers of resumes and can't afford to become students of the life history of everyone applying with their company (times the number of job postings for that company).

Fair enough then, the resume must be short.

Second, the resume's content, like the cover letter, must hook the reader with the idea that you are at the very least one of a few good candidates for the job. Resolved: the resume must be carefully composed.

However, that's where the resume advice currently propounded by modern would-be employment advisors stops. These would have you eliminate the traditional list of places worked, activities, associated skills and objectives accomplished in favor of a more succinct, "Here's what I can do for you" evoking your entire work history without exploring any of it.

That idea of a short, killer resume is seductive, but erroneous, particularly in my industry.

The truth is that even assuming your resume gets read, you'll be rejected no matter how "cool" it is if the hiring manager or even HR person is unable to get a feeling for who you are and where you've been.

Now, I admit that this may just be my industry. For a teenager vying for his third job as night manager at Pizza Hut, the results-oriented resume may be better. However, I've spent a great deal of effort hob-nobbing with software recruiters, HR folk and engineering managers. To a man (or woman), each has insisted on getting my traditional resume.

In the last few days, one of them took pity upon me cued by an oblique comment I made on the telephone to the effect that I kept having to ante up my old-format resume when I'd been told never to do that. He enlightened me a great deal and I'm relating some of that here.

Merely listing a skill set and the technologies employed is meaningless to the resume reader. It's true that naming your skills is crucially important, but it isn't enough.

I'm also told that the cover letter gets the resume read or left in the stack (no argument from LDS on that point). And I'm told that the list of skills and technologies performs the same function, i.e.: it keeps the resume reader from tossing the resume into the trash. So far, so good and...

(A parenthesis: To this end, the skills must be listed clearly. The human eye can sort out mistakes at this point, but if the manager is using a query to search a database like Monster, Dice or his company's own, one driven by a blind, robotic extraction mechanism, he can't be bothered trying Java AND Java/JEE AND Java/J2EE, etc. He's only going to try Java AND J2EE or JEE, etc. If you've put "Java/JEE", it's unclear that both "Java" and "JEE" will come out of it: you're at the mercy of the resume parser on that point, and you didn't get to write it. You'll probably fare better with something like "Java, JEE (J2EE)" and "Linux (UNIX)", etc.)
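To see the problem concretely, imagine a naive parser that splits the skills line only on commas and whitespace, which is the kind of blind extraction I mean (the tokenizer here is hypothetical, purely for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class KeywordDemo
{
    // A deliberately naive tokenizer: split on commas and whitespace only,
    // the way a blind resume parser might.
    public static List<String> tokens( String skills )
    {
        return Arrays.asList( skills.split( "[,\\s]+" ) );
    }
}
```

Against this parser, "Java/JEE" survives as a single opaque token, so a query on Java alone misses you entirely, while "Java, JEE (J2EE)" yields "Java" and "JEE" as separate, searchable tokens.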

But, I'm also told that the engineering manager wants to know "how much skill" one has and "how much acquaintance" one has with the technologies listed. "Extensive experience with relational databases" goes nowhere. What the hiring manager wants to know before going to the trouble and spending the time to interview you is how you gained the skills and the technological familiarity.

The engineering manager figures this out by undertaking an examination of where you've been and what you've been doing most recently. Therefore, it's crucial to say what you've done and how you've done it in order for him to gain an idea of how deep the skills run.

This is an important point: say what technologies and skills were used company by company and project by project. In this way, the manager knows he wants to speak with you. If it's unclear, you're relying on having piqued his interest enough and there not being an adequate number of other candidates.

Another person I'm in contact with actually works occasionally for LDS Employment (I shan't give out his name). He's required to toe the party line on this point even though he knows full well that the advice is unsound or, at least, misapplied when it comes to the software industry. So he's quiet on this point during his "volunteer" life.

In summary...
The resume must read as short, with content to grab the attention of a) the HR person, who is probably not very technical but has a list of keywords to search on, and b) the engineering manager, who wants meat to eat and expects a seasoned professional to have been around the block and have something to show or say about it (i.e.: four pages or more).

If the manager is interested, he'll want to save time by delving into the short statements about each place worked and be able to weigh in his mind the likelihood that the whole resume adds up to someone he wants to go to the trouble to interview. The length of a traditional resume gives him that capability.

Unless your name is Mahatma Gandhi, a one-page resume with a few really powerful statements about you will probably not suffice. Software engineers can only rarely cough up heart-grabbing assertions such as "increased sales through field leadership by 80% year-over-year" or even "saved three companies' payroll departments 46% in the first quarter after shipping a second, refactored version of the software."

Tuesday, September 21, 2010

Smartphone wars: our side is winning!

I knew that sales of Android phones were already surpassing those of Apple's iPhone, but I had no idea that actual Android marketshare had overtaken all other types of smartphones!

Looks like I'm on the winning side. Well, okay, I'm on the winning side to the extent that I fancy myself an Android developer, which I've become of late, having a lot of spare time left over from my present employment debacle as Emerson Network Power closes down Avocent's offices in Salt Lake City. So very sad; but, as I say, an opportunity to learn something new and exciting!

If you've been following, I've also dabbled a wee bit in BlackBerry. (Be warned: I lapse into the arcane for the next few paragraphs.)

That platform is a great deal less exciting than Android because the Java support is constrained to what's referred to as J2ME, or Java Platform, Micro Edition. It is super-restrictive and so tiny that a great deal an ordinary Java programmer finds important is missing from it. In my case, I need to write an XML parser that runs on BlackBerry because (so far) I haven't found one available. Not a huge task if I settle for just the functionality I need, which is very slight, but a difficult challenge without Java reflection, something missing from J2ME:
import java.lang.reflect.Method;

private void doThis( Class c )
{
    // Enumerate a class's public methods--java.lang.reflect doesn't exist on J2ME.
    Method[] methods = c.getMethods();
}

Sorry, mate: can't possibly do that!

Rolling separate code in the form of a library (JAR) into a BlackBerry application is another mountain I've not succeeded in climbing yet either. It appears to be a black art. Precious little of any code I write for Android will work on the poorer platform and none of it in the form of an external JAR.

Moral: buying BlackBerry is like buying a car sporting a hand crank, spark-advance control and a manual choke in our day.

Anyway, I'm having lots of fun on Android doing some cool things that seem to work well and I'll have an application out in distribution soon. Which is not to say that it will make me any money. It's a free part of a bigger product and money will only come in based on many other factors. Alas, there the planets and stars may never align. Nevertheless, the journey is the reward and I will not have sunk my entire life into it.

Monday, September 20, 2010

Broad experience: the browser plug-in

My nephew, Richard, rather amazes me by the breadth of his experience sometimes. I think true geeks grow up "broad" nowadays. Many of the rest of us have had careers in very vertical pursuits, and breadth has come to us mostly at the moments of technological shift, from ALGOL and Fortran to C, from C to C++, from C++ to Java or C#.

Anyway, among the myriad things he's done in his as-yet short career of PHP and web work, streaming video and a few other undertakings is an open-source project that grew out of experience gained at Move Networks, meant to enable rapid development and easy maintenance of browser plug-ins written for just about any platform. Move is a now-struggling concern with some pretty super streaming-video technology whose future is, to say the least, a bit uncertain at this point. Browser hosting of video streams was the name of the game.

The project is called FireBreath, but that doesn't so much tell the tale. Check out A Year in the Life of an Open-source Project.

Browser extension vs. plug-in

While we're at it, let's note that there is some confusion in the space between browser extensions and browser plug-ins.

The first is code extending the browser, usually to give it the ability to do rather sweeping things like debugging. I use something called Firebug with my Firefox browser to help me find trouble in my web pages, or to inspect other web pages when there's something cool in them I haven't seen before or want to know how to do.

It would be a real stretch to unify the development of browser extensions across multiple browsers and platforms.

The second is a bit more conservative an undertaking: the plug-in affects pretty much just the web page being viewed (and not the browser application displaying the web page) and is often brought in via the <embed> tag in HTML, with which I have much experience, or, more modernly, the <object> tag.

An example of this would be something that plays MIDI files or sets up for viewing Windows Media files, both things I do in some of my pages. Apple's QuickTime player and Macromedia's Flash are two more examples of browser plug-ins. I also use temperature converters on my cooking site, from poodwaddle, that I helped design, and a calendar on other pages of mine (also from poodwaddle) that are plug-ins. You may remember having to install such things before gaining the full user experience of some web sites.

In summary, we've been talking about Richard's plug-in framework, FireBreath, and not about browser extensions.

Sunday, September 12, 2010

Irony drips...

Because of the sorts of things I do on Linux, mostly work related to software development, I have little need ever to print there. For years I've relegated printing, something I do very little of anyway, to whatever box happened to be my primary Windoz computer host. (I always keep a Windoz box alive for my personal use in order to use software I like that won't run on Linux, such as PaintShop Pro 7, even though I know that there are solutions for doing that from Linux too.)

Since I acquired a new Windows 7 Professional 64-bit platform late last year, printing has been largely unworkable. At first, in fact, it seemed to work perfectly well, but about the time I lost my motherboard, replaced it, then found I had to reinstall from scratch, it stopped working. While I didn't detect any anomaly during re-installation and the resurrection of the data I had, for the most part, carefully backed up, printing was thereafter very hit-and-miss. In fact, I had only printed one or two pages just to try things out, so I don't know for certain that it ever worked permanently and well.

How the mighty one has fallen!
Usually, things went like this: plug my Hewlett-Packard 5550 into a USB port, note that Windows loaded the driver, then print something. Early on, this often worked the first time, only to stop working the next time I tried to print. In frustration, and because my office has been in an awkward flux since the first of the year, I'd unplug the USB cable and forget about it for a week or two or three. Very soon, however, it stopped working at all.

Of late, I've pulled my hair out over this printer and my Windows box, no longer able to get them to work together even a single, initial time. Incidentally, I can't get this box to support my built-in multi-card reader either, nor my external reader. And Google tells me I am far from alone in my observation: Windows 7 doesn't reliably support printers or card readers.

Decidedly, not only does Windows have its usual troubles with inconsistent interfaces, but since the last version of Windows that worked (XP) in true plug-and-play fashion, its utility has sunk very low indeed.

I mean I'll grant you that I'm an idiot, but so's your grandmother. And yet, until Windows Vista, even she could plug in her new printer or card reader and immediately get a working peripheral with no need to Google to find out how to overcome a lack of support for such common devices. It just worked. It no longer does.

So, that irony I was speaking about...
I grew up under the UNIX operating system in my early career. Configuring a system was a pretty hard thing to do. Even after years getting used to the ease of Linux in doing most things, I continue to be surprised by it. Such was the case this morning.

I really needed to print out a recipe in order easily to take notes on it later today, because I'm going to present this recipe to a formal gathering in a local theater; I'm making the dish today. Annoyed at the prospect of spending fruitless hours messing about with getting my printer working on Windows 7, I decided I might be ahead learning to get it running on Linux.

From my UNIX years, I have a knee-jerk expectation that it's not going to be straightforward, so I cast around on the web up front for some help. Not finding very recent articles on how to get it running (reading out-of-date articles on Linux can be an exercise in frustration, as the myriad distros have progressed very rapidly), I gave up and just plugged the #$*@ thing in. A few seconds later, a notice popped up on my desktop announcing my printer by (accurate) name and claiming that it was set up and ready to go. I'm not one to be fooled by such a cheap trick, so I put it to the test: brought my recipe up in Firefox, then printed it. What to my wond'ring, but grateful, eyes...

Of course, this dripping irony as I call it is of my own making: I should henceforth believe that Linux can indeed do everything. And, I should turn my back on Windows forever. But I won't. I will still keep my foot in the door out of some sense of misguided interest. And I will continue to snipe and complain about Windows as it falls from utility.

Hehehe, now I'm going to attach my card reader someday soon—another peripheral I've always and only consumed from Windows.

Friday, September 3, 2010

Amy's grand day out!

I got a call this morning around 1030 announcing that Amy had been in a multi-vehicle accident an hour or so before. She was transported to St. Francis Medical Center in Grand Island, Nebraska.

Amy's health
Amy is in stable condition, very sore, scratched and bruised all over with particular trauma to her knees and chest. It's like someone beat her mercilessly with a rubber hose in a dark alley. She has two cracked ribs. By all rights, those should have broken and punctured her heart or lungs. She has been able to get up more or less by herself to shower, so that's a good sign, right?

Details of the accident
It happened on Interstate 80 south of Shelton, Nebraska, about 25 miles west of Grand Island. She came upon a car that was stopped, veered to avoid it, then found herself embedded in an 18-wheeler: she had crossed the median into the truck, hitting it more or less head-on. She had to be cut out of her vehicle. She was driving a 2006 Chevrolet Malibu belonging to her employer, KHAS-TV, a television station.

Her boss and one or two coworkers went to the scene soon after the accident and described the vehicle to Julene as having "little left to show it had been a car." Given that fact and how little injury Amy apparently suffered, it is nothing short of a miracle. It was probably a very good thing this happened in KHAS-TV's vehicle instead of hers, and not just because hers would be gone: the station's vehicle was newer, safer and probably bigger.

Julene's trip out
Coincidentally, Julene had planned a trip out there next week; those plans were accelerated by a few days. We looked for flights, but from Salt Lake it's literally impossible to reach Amy by air any faster than it takes just to drive; even if we'd bought a ticket, she couldn't possibly have reached Amy until Saturday evening.

From Provo to Grand Island is 800 miles and dust, 95% of it on Interstate 80. Google Maps says about 13 hours. The major points of the itinerary are Evanston, Rock Springs, Rawlins, Laramie, Cheyenne, North Platte, then Grand Island. She took a Garmin GPS, her cell phone and my iPod. She drove our little Mitsubishi Galant, which recently passed inspection and got an oil change today, plus new tires and a windshield a month ago.

Saturday, 4 September
Julene reached St. Francis Medical Center in Grand Island around 1400 Central. (I've updated Amy's health higher up.) Julene tells me that everyone acquainted with the details of the accident says nobody survives that sort of thing. It is being proclaimed a miracle.

Everyone from her work is sending her flowers and stuffed animals. Her room is full of them.

Sunday, 5 September
Amy was released today. Julene will be staying with her a few days while she gets better.

Apache Tomcat configuration explained

In an echo of "better late than never" and of a post from a couple of weeks ago, I have released a short article detailing the workings of that erstwhile bane of my existence, web.xml.

The impetus for this was the need a few days ago to define two separate RESTful servlets I was comparing in the same server I'm writing.
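For the curious, the shape of that configuration looks something like the following sketch; the servlet names, class names and URL patterns are all invented for illustration:

```xml
<web-app>
  <!-- Two independently mapped servlets living in one web application -->
  <servlet>
    <servlet-name>rest-one</servlet-name>
    <servlet-class>com.example.FirstRestServlet</servlet-class>
  </servlet>
  <servlet>
    <servlet-name>rest-two</servlet-name>
    <servlet-class>com.example.SecondRestServlet</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>rest-one</servlet-name>
    <url-pattern>/one/*</url-pattern>
  </servlet-mapping>
  <servlet-mapping>
    <servlet-name>rest-two</servlet-name>
    <url-pattern>/two/*</url-pattern>
  </servlet-mapping>
</web-app>
```

Each servlet gets its own name, class and URL pattern, which is what lets the two coexist in one server.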

It's gratifying when one looks back to realize that the scales have completely fallen from one's eyes.