Monday, 7 April 2014

Strange Java issue with reflection and statics

Yesterday I found myself struggling with an odd Java problem. I suspect it is a Java bug, though I have deeply, deeply ingrained resistance to blaming anything on my environment. I can hear university lecturers and early employers pointing out 'it is always you' when I naively suggested that possibility years ago. But this just may be the exception.

I'm using reflection to discover the data type of the first parameter on this method:

public void init(Email annotation, PropertyMetadata propertyMetadata)

in this case Email is actually an annotation rather than an ordinary class. It has a full type of

nz.co.senanque.validationengine.annotations.Email

But when I locate the init method using reflection and get the type of the first parameter like this:

Class<?> p = method.getParameterTypes()[0];

the type I get is

java.lang.annotation.Annotation

rather than

nz.co.senanque.validationengine.annotations.Email

and this causes my validation engine to ignore validating email fields. I have other kinds of validation (e.g. Length) which work perfectly well; the type for Length is returned correctly, no problem. So I went over the code to see what the difference was. The good news is I eventually found it; the bad news is it doesn't really make a lot of sense.
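To make the lookup concrete, here is a minimal, self-contained sketch of what the reflection code does. The Email annotation and EmailValidator below are simplified stand-ins for the real Madura classes, not the actual code:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

public class ParamTypeCheck {

    // Stand-in for the real Email annotation (illustrative only)
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Email {}

    // Stand-in validator with the same shape of init method
    public static class EmailValidator {
        public void init(Email annotation, Object propertyMetadata) {}
    }

    // Return the class name of the first parameter of the named method
    public static String firstParamTypeName(Class<?> clazz, String methodName) {
        for (Method method : clazz.getMethods()) {
            if (method.getName().equals(methodName)) {
                // getParameterTypes() already returns Class<?>[], so no cast is needed
                return method.getParameterTypes()[0].getName();
            }
        }
        return null;
    }

    public static void main(String[] args) {
        System.out.println(firstParamTypeName(EmailValidator.class, "init"));
        // prints ParamTypeCheck$Email
    }
}
```

In the stand-in the reported type is the annotation itself, which is what should happen in the real code too.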

The init method shown above lives in a class called EmailValidator. Again, I have a list of these such as LengthValidator, RegexValidator and so on. They all look the same, except that EmailValidator misbehaves. The one difference is that EmailValidator has some static Strings defined. It looks like this:

public class EmailValidator implements FieldValidator
{
    private static String ATOM = "[^\\x00-\\x1F^\\(^\\)^\\<^\\>^\\@^\\,^\\;^\\:^\\\\^\\\"^\\.^\\[^\\]^\\s]";
    private static String DOMAIN = "(" + ATOM + "+(\\." + ATOM + "+)*";
    private static String IP_DOMAIN = "\\[[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\.[0-9]{1,3}\\]";
...


See? Just ordinary static Strings. They don't seem to have any relationship to the init method. But they do. If I make them non-static (i.e. remove the 'static' from the definition) then... it all comes right. The datatype comes back as

nz.co.senanque.validationengine.annotations.Email
 
as it should. There is no downside to making the statics into ordinary fields because these classes are singletons, instantiated only once. And, of course, the validation now works.

The result: when I edit the '@' out of Amy's email address and tab off the field, the error signal shows and the 'save' button is disabled, which is what is supposed to happen when there is an error. When I roll the mouse over the error signal it pops up with more detail. The totally cool thing about this is that all I need to do to add validation to the field is annotate it with @Email; everything else just happens (as long as those static fields are changed).

Friday, 4 April 2014

The End of Windows XP

Microsoft have announced they will stop supporting Windows XP around now. Apparently there is still a lot of it about. It was a pretty good operating system in that it was stable and didn't use too much resource. So people hung onto it (including Mrs).
It won't suddenly stop working, of course, but there won't be any new patches to plug security holes that inevitably get found and exploited. That's really what 'support' means. And I'm listening to a discussion on the radio about your options if you are still running it.
The obvious thing to do is pay a couple of hundred dollars to MS and get a copy of Windows 8, install that (it will be a clean install, not an upgrade) and learn to use a whole new interface. Except that Windows 8 probably won't run on your old computer because W8 needs more resources (memory, disk space, CPU). Your old 3rd party software will run okay though, probably, if there is enough resource.
What they didn't mention on the radio is another option: Linux.
Here's why you should consider Linux. If you are running an older machine, the one to look at is Lubuntu:
  • It will run on your existing machine, you don't need to upgrade your hardware. I'm running it just fine on a machine I bought in 2005.
  • It looks enough like XP to make you feel comfortable, probably more like it than Windows 8 does.
  • It will probably run your existing 3rd party software. No guarantees, but Word and Excel run just fine under a compatibility product called Wine. Setup is simple.
  • It is free. This point would be unimportant if it did not work, but it does work, so I'll say it again FREE (as in beer). You just download it. You don't need your credit card.
  • Support is not going away and there is a vibrant community of helpful people. I've never yet had to ask anything, a quick Google always finds someone else with my question, and the answer.
Yes, you will have to do a complete re-install to get there, but you were up for that anyway. Linux is seriously simple to install these days, at least as easy as Windows and it happily auto-detects your hardware and sets it up without any fuss.
Lubuntu is one of many 'packaging' options for Linux and you can find lots of others, but Lubuntu was specifically designed to look more like XP to the user and to be lighter weight, i.e. to run on old machines.
You can even give it a quick try-out. Download the CD image from the Lubuntu site and burn it to a CD, then boot from the CD. The CD gives you the option of running stand-alone (without touching your disk drive; it all runs in memory) or installing a system on your disk. Use the first option to give it a try.
When you decide you like what you see you can install it alongside your Windows system, but you are better off putting in a new disk and installing onto that. The reason I recommend this is so that you absolutely know you aren't going to press the wrong button somewhere and overwrite your existing system.
You can then put your old drive into a USB enclosure and copy your data across to the new drive.
All this is no more hassle than installing Windows 8, far less if it saved you a hardware upgrade.

Wednesday, 19 March 2014

JMX, Tomcat and VisualVM

I've spent most of today wrestling with this and it ought to have been easier, but every set of instructions I found had lots of steps I didn't need, and almost all of them missed one vital step.

So what am I trying to do? Java has a feature called JMX which allows us to expose parts of our applications to the outside world for the purposes of monitoring and control. For example I have a small lock management system. Although it never goes wrong, and never will, of course, it seems prudent to expose a way for a sysadmin to go kill a lock that has been left in place by mistake. JMX exposes 'MBeans', which are essentially Java classes with methods that JMX lets me call remotely.

The environment: Tomcat 7, JDK7, VisualVM (bundled with JDK)

Tomcat is my app server and it contains my application code. It has JMX services built in. VisualVM is my client, it gives me a UI that I can do the monitor/control stuff from. In addition I am using Spring 3.2.6 in my application because it has code to simplify exposing the MBeans.

Spring used to have a separate module for JMX called spring-jmx, but I noticed that has not been updated since version 2. They've rolled the JMX code into spring-context. I already have that library in my maven dependencies so that's fine.

I added the following code to my Spring configuration file:

<bean id="simpleLockerJMX" class="nz.co.senanque.locking.simple.SimpleLockerJMX" />

<bean class="org.springframework.jmx.export.MBeanExporter" lazy-init="false">
    <property name="beans">
        <map>
            <entry key="bean:name=simpleLockerJMX" value-ref="simpleLockerJMX" />
        </map>
    </property>
</bean>



The first bean, simpleLockerJMX, is the MBean I want to expose through JMX. The second one tells Spring that this is what I want to do. They do make this very easy. The simpleLockerJMX bean doesn't need to know it is an MBean; it is just a simple Java class. There are many, many posts around that make this more complicated, including in the Spring docs, but all I need here is enough to prove the concept, and this works. It is, I think, limited to the local machine and has no security (other than being limited to the local machine, of course). Those options can be added if you want more complexity.

Now to tell Tomcat we want it to do JMX. This is done by adding these to the catalina.sh file:

 -Dcom.sun.management.jmxremote
 -Dcom.sun.management.jmxremote.port=8090
 -Dcom.sun.management.jmxremote.ssl=false
 -Dcom.sun.management.jmxremote.authenticate=false
 -Dcom.sun.management.jmxremote.hostname=localhost

They get added to the CATALINA_OPTS argument. Adjust your port to one that is available on your system. If you're running Tomcat under Eclipse then add those entries to the VM Arguments in the arguments tab in the tomcat launcher. Also edit your tomcat-users.xml file to grant one of your users the role: manager-jmx
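Rather than editing catalina.sh itself, a tidier place for these is a setenv.sh file, which catalina.sh reads automatically if it exists. A sketch (adjust the port to one free on your system):

```shell
# $CATALINA_BASE/bin/setenv.sh -- picked up automatically by catalina.sh
CATALINA_OPTS="$CATALINA_OPTS -Dcom.sun.management.jmxremote"
CATALINA_OPTS="$CATALINA_OPTS -Dcom.sun.management.jmxremote.port=8090"
CATALINA_OPTS="$CATALINA_OPTS -Dcom.sun.management.jmxremote.ssl=false"
CATALINA_OPTS="$CATALINA_OPTS -Dcom.sun.management.jmxremote.authenticate=false"
```

This keeps your changes out of the stock Tomcat scripts, so they survive a Tomcat upgrade.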

Now you can start Tomcat and it should be exposing the beans. You can check by logging into the Manager app in Tomcat (which is not there if you're running under Eclipse, so start it stand-alone for that step) and there's a way to display the exposed beans. This is using an internal view of things though, not quite the same as exposing them to the outside. Here is what you do:
  1. Browse to http://localhost:8080/manager/jmxproxy/
  2. log in as the user who has manager-jmx
You should see a fairly crude dump of the exposed MBeans, including the new one.

Next step is to start VisualVM. This is a utility that comes with Java. There used to be a similar utility called jconsole that VisualVM supersedes. Run it by typing jvisualvm on a command line.

The first thing you must do once you start VisualVM is install the MBeans plugin. I wasted hours by missing this step. Go to Tools > Plugins, then the Available Plugins tab, check the VisualVM-MBeans plugin and follow the short install procedure.

Then you should see the new MBean in VisualVM's MBeans tab.

There are more methods exposed on the class than I'd like but we are definitely seeing it in VisualVM so I call it working.

To reduce the excess exposure of the MBean I added Java annotations like this:

@ManagedResource(objectName = "nz.co.senanque.locking:name=simpleLocker",
        description = "manager for Simple Lock Factory")
public class SimpleLockerJMX {

    @Autowired private SimpleLockFactory m_simpleLocker;

    public SimpleLockFactory getSimpleLocker() {
        return m_simpleLocker;
    }
    public void setSimpleLocker(SimpleLockFactory simpleLocker) {
        m_simpleLocker = simpleLocker;
    }
    @ManagedAttribute(description="The current locks")
    public String getDisplayLocks() {
        return m_simpleLocker.toString();
    }
    public void setDisplayLocks(String s) {
        // read-only attribute; nothing to set
    }
    @ManagedOperation
    @ManagedOperationParameters({
        @ManagedOperationParameter(
            description="Name of the lock to kill", name="lockName")
    })
    public void killLock(String lockName){
        m_simpleLocker.unlock(lockName);
    }
}

This bean just delegates to another bean, the SimpleLockFactory, to do what we want. That bean is not an MBean, though it is a Spring bean, so it is not exposed through JMX. I added the @Managed... annotations to the SimpleLockerJMX I had before and changed the Spring configuration a little:
    <bean id="simpleLockerJMX"
     class="nz.co.senanque.locking.simple.SimpleLockerJMX" />
    <context:mbean-export/>

And that is all I need. The result in VisualVM is that just getDisplayLocks and killLock are visible, which is what I want. If I annotate any other classes like this one they will be picked up automatically by Spring. It does mean I have Spring annotations in my Java code, which makes it dependent on Spring, but I have Spring dependencies in other places already.

This was all just fine until I added another JMX bean. It looks more or less like the one above, but with a different delegation bean injected. And that doesn't work at all. The stack trace is confusing but it seems to be a problem with Spring's initialization. It is as if the MBean exporter requires all dependent beans to be initialized before the MBean can be completed. In this case they aren't, though it isn't clear to me why. I got around it by deferring the injection of the dependent bean. The code looks like this:
    // m_beanFactory comes from implementing Spring's BeanFactoryAware
    private BeanFactory m_beanFactory;
    public void setBeanFactory(BeanFactory beanFactory) {
        m_beanFactory = beanFactory;
    }
    @ManagedOperation
    public boolean isFrozen() {
        return getExecutor().isFrozen();
    }
    private Executor getExecutor() {
        // defer fetching the bean until first use, after initialization has settled
        if (m_executor == null) {
            m_executor = (Executor)m_beanFactory.getBean("executor");
        }
        return m_executor;
    }

This makes the class even more dependent on Spring, of course. But it does work.

Wednesday, 12 March 2014

Ash Wednesday in the South

Ash Wednesday has just come and gone again this year. It is one of those things that we Anglicans have a mixed commitment to. Some of us do the whole ashes-on-the-forehead thing, some of us let it go by, and possibly some of us wonder what the point is.

For those of you who don't know, Ash Wednesday marks the beginning of Lent which is a time we're supposed to hold back on the material things of life and take some time to build up our spiritual side so we can more appreciate Easter which comes at the end of Lent. Anglicans don't have strict rules around what we do in Lent, so some of us get into fasting etc and some of us don't and no one minds. The ashes-on-the-forehead are supposed to come from the little palm crosses they pass around on Palm Sunday (Sunday before Easter) which we've kept all year and they get burned on Ash Wednesday. I knew this well as a kid, but I never did manage to keep track of my palm cross for a whole year. Somehow by next Ash Wednesday it had gone missing. Still, it is a nice idea if you can do it.

There's a point to it all as well, the ashes are a reminder that we are all going to die (ashes-to-ashes) so this life is all temporary. In pre-Christian times Roman generals while enjoying their triumphal march into Rome after victories in far off lands would have someone next to them whispering from time to time 'remember one day you will die'. It was to keep them from getting too up themselves. There may be a connection with the ashes, though maybe not.

In the old days in the North the subsistence farmers were nearing the end of winter and their stores were getting low. It was a really good time to cut back on food, but they knew spring was not far off and something to look forward to. Weaving this notion into the Christian story made good sense even though, it should be noted, the Christian story doesn't quite fit it. Jesus fasted for 40 days which is supposed to be the Lent period, but he did it before he began his ministry, not three years later just before he was crucified. But they worked with what they had and made an annual cycle out of it all. It was probably helpful to have spiritual leaders prompting people to eke out their stores for the last of the lean season. Mardi Gras, or Carnivale, which is just before Ash Wednesday may be connected with the idea that there would not be enough food for any excess livestock in the next couple of months so best to eat them now. Carnivale and carnivore are closely related words.

Here in the southern hemisphere the whole Lent thing is awkward. We're in our abundant season. Fruit is falling from our trees uneaten because we can't keep up with it. I feel the need to just go out there and munch down a few more pears and apples and plums and figs and... well it would stop them going to waste. Sure, we preserve stuff, and even give some away, but it is still hard to keep up with. Tightening my belt around now just doesn't seem right.

There are aspects of Lent that do not involve food, but I like to think I do my share of those the rest of the time, prayer and kindness and so on. So Lent does sort of pass me by usually. Possibly there would be value in doing something Lenten in six months time, around September, but in our climate that is when spring is well underway and we're getting the first asparagus. Possibly July, which is really winter, would make more sense.

So I don't really worry about Lent too much, and I enjoy Easter when it comes.

Sunday, 26 January 2014

One Smart Cop

This morning we saw an interesting catch by a traffic cop. It's a long weekend here and there are lots of cops around. Just south of where we live is a longish 80km/h section (usual open road limit is 100km/h). It's slower because there are several sneaky corners interspersed with tempting straight sections. Our car has a speed limiter on it and we make use of that to keep to the limits. This morning Mrs was driving with the limiter on, and a car behind tailgating 'cos he wanted to go faster, which is not unusual through there.

We passed a cop parked on the side of the road, no doubt with radar on etc (also not unusual through there). The tailgater pulled back when he saw the cop, of course, and we both passed him looking quite sedate.

Mrs noticed the cop pull out just after we passed and checked her speed just in case, it was fine. Once we were around the next corner the tailgater passed us. Shortly after that the cop passed us at speed. We wondered why. He didn't have his lights flashing and we made irreverent jokes about him being late for his tea break etc.

A little later we found him with his flashers on stopped behind the tailgater and with his pad out etc. Well, we knew the tailgater wanted to go faster than us, and so it seems did the cop. Presumably the cop had spotted him tailgating before he had a chance to pull back.

One smart cop. I wonder how often they use that trick?

Tuesday, 21 January 2014

On Reading a Book

I'm reading a book just now, or trying to. When I say a 'book' I mean a book made of paper, not an eBook. This is the kind of book people talk about when they wax eloquent about the joy of real books. It is 'London: The Biography' by Peter Ackroyd and it is a fine work. The cover is interesting, the paper is good quality and the binding is well done. It needs to be well bound because it is a thick book and fairly heavy. If it were new it would probably have a 'new book smell'.

The writing is excellent and the material is riveting. So there is every reason for me to be racing through this book.

And yet I am not, and I found myself wondering why.

I often read at breakfast. It is a good time to catch up on reading my weekly 'New Scientist' and my brain appreciates the warm up before the day really starts. But I cannot read this book at breakfast. It wants to flip closed all the time and it takes one hand to hold it open and two hands to turn a page. I need between one and two hands to eat so it doesn't work. The second problem is that there is a good chance I will spill something on it, especially when I'm struggling to hold it open and eat at the same time.

These aren't issues with magazines like New Scientist because they lie flat and they are ephemeral enough that the odd bit of egg or cereal landing on them doesn't matter. I also often pick up my tablet (iPad Mini) and check the newspapers. Again, I can work that with one hand and food spills wipe off without damage.

The other time I read is in bed before I put the light out. There's no food and I have both hands free. But propping up a heavy book gets a bit wearing and if Mrs wants to put the light out sooner than I do then we have to compromise. The tablet wins out there as well. It is not nearly as heavy as the book, and I can read it in the dark. I often wake up early and I can read in the dark before Mrs wakes.

I do rather like nice books, and I have a fair collection of them. But as for actually reading them, the tablet seems to do a better job. And books that are less than nice, such as cheap paperbacks, they come a poor third.

Saturday, 14 December 2013

Maven (not the space probe)

Until recently I had always used ant and ivy to manage building my Java projects. It works well except for one really annoying thing. Actually maybe it is several. When you have a team of programmers maintaining ant scripts it is easy for standards to drift, various projects have 'special needs' in their builds and that has to be catered for. Sometimes people just create projects with a different directory structure because they feel like it. I'm all for freedom of choice in these matters, except when it is late at night with a delivery in the morning and I am trying to work out someone's odd build arrangement that isn't working. Then my inner fascist surfaces and I wonder if we couldn't do better.

So I moved all (um, nearly all) my projects to use an included ant file containing all the standard targets we need for a build and the builds just have to conform to those. That worked quite well, then the really annoying thing turned up.

I wanted the builds to be easy to get started, so you can pull the project from source control and just run the build. I'm assuming ant is already installed, but I'm not assuming anything else. What about ivy? Okay, my included file looks for an ivy directory and pulls the ivy jar file if it isn't there. Ivy pulls everything else.

What about the include file? Well, that has to be committed into the project. But it is the *same* include file in every project, and committing it multiple times violates DRY. Yuck. Plus when I change it I have to change it in lots of places; well, that's why we say DRY, isn't it?

You've already seen the title so you know where I'm going. I thought I would try maven. Maven uses plugins to do those targets the include file delivers but, because plugins are just jar files, they can be managed as dependencies, so maven will pull whatever plugins I specify from the repository. They've pretty much all been written already and I just have to conform to the standards they dictate. That means so does everyone else, and we all understand the layout etc of our projects and how they are built. Great. Even better, you just have to have maven installed, then any build you pull can be run just by invoking maven. No need to do tricky things with a dependency manager. The entire build plus dependencies for the project is defined in a pom.xml file.
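For anyone who hasn't met one, a minimal pom.xml looks something like this (the group and artifact names here are illustrative, not one of my real projects):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>nz.co.example</groupId>
  <artifactId>my-project</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <dependencies>
    <!-- dependencies are fetched from the repository automatically -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```

Everything else (directory layout, build phases, output naming) comes from maven's conventions, which is the whole point.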

How does this work in practice?

Quite well, better than I expected in fact. But there are some tricky bits that weren't immediately obvious.

First, all my builds use MaduraDocs, which was basically an XSD to define the format, a couple of XSL files and a load of ant files to pull it together. Every build uses MaduraDocs, so it was immediately clear this would have to be a maven plugin. That was good. Rewriting the scripts into Java made them simpler and I could make use of some OO techniques to make things more flexible. I ran into one issue there: how does a project build when it depends on a plugin that is itself? MaduraDocs depends on the MaduraDocs plugin. Well, I cheated a little. I invoke the MaduraDocs code from one of the tests and that builds the documentation for MaduraDocs itself, rather than invoking the plugin.

The next challenge was MaduraObjects. MaduraObjects is a JAXB plugin, not to be confused with a maven plugin. Though, and here's the tricky bit, there is a maven plugin for JAXB called maven-jaxb22-plugin, and maven-jaxb22-plugin accepts JAXB plugins, of which MaduraObjects is one. My initial approach was to simply add maven-jaxb22-plugin to my project and have it run, invoking the MaduraObjects code. Except it doesn't. For some reason I never figured out, it refused to find the MaduraObjects code, so I had to try another approach. Again I fell back on tests. I wrote a test that invokes maven-jaxb22-plugin, making sure the MaduraObjects code is on the classpath, and that works. It results in some generated Java which then needs to be recognised by the phase that compiles the test code. To do that I use the build-helper-maven-plugin, which can tell maven to include another directory in with the test code. If you use the build-helper-maven-plugin just right the Eclipse maven plugin uses it to recognise the generated source directory and adds it to the project classpath, which avoids various red Xs in Eclipse.
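The build-helper-maven-plugin configuration for that looks roughly like this (the generated-sources path is illustrative; use whatever directory your generator writes to):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.8</version>
  <executions>
    <execution>
      <id>add-test-source</id>
      <!-- run before test compilation so the generated code gets compiled -->
      <phase>generate-test-sources</phase>
      <goals>
        <goal>add-test-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>${project.build.directory}/generated-test-sources/jaxb</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The add-test-source goal is what the Eclipse maven plugin recognises when it builds the project classpath.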

MaduraBundles is even more fun. The test code requires three small jar files to be built and loaded into a sweep directory. MaduraBundles loads small jar files as sub-applications and the tests have to try that out. Maven is *very* keen on the 'one project one output' approach; this project doesn't actually violate that, it just needs to generate some test jars. This is the most complex and the oddest of my projects. It works and is not so very complicated, but I suspect this is a rare case of extreme complexity.

Finally MaduraRules needed another maven plugin and this one generates source files, so it has the same issue that MaduraDocs has in that it must avoid invoking itself, and it needs to have the generated code added to the test code for compiling. In practice the code that does the work is invoked by tests in the MaduraRules project and the plugin is a simple wrapper to that code placed in a separate project.

These tricky bits took some working out but overall it ends up simpler, I think. One of the things that made me sceptical of maven initially was how I would handle the various configurations I was using in ant+ivy. When using Ivy I normally had configurations for test, compile, package, build and so on. I could define whatever ones I wanted and just use them. I could pull all the dependencies of, say, the build configuration, into a directory and use that as the classpath for the build. Specifically I would use this for invoking JAXB to generate Java from an XSD file or to generate rules from a rules file. I knew that maven had more strictness around this with a specific set of phases that, while it could be extended, was best left alone. How would I manage my builds?

The answer is delightfully simple. Because plugins can have dependencies and those dependencies are managed by maven directly you don't need a build configuration, you just specify the plugin you want. The dependencies of the plugin are distinct from dependencies of the project itself. So I don't need a build configuration anymore, nor do I have to write scripts to pull the build dependencies into a directory. As for compile, test and package there are equivalents in maven. Test dependencies are only used in the test phase, compile dependencies are everything the project needs to compile against and these are always packaged as well. There is sometimes a case where the project needs to compile against something but not package it, in this case you tell maven it is 'provided'.
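The servlet API is the classic example of 'provided': you compile against it, but the container supplies it at runtime so it must not be packaged. In the pom it looks like this:

```xml
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>javax.servlet-api</artifactId>
  <version>3.0.1</version>
  <!-- compile against this, but Tomcat supplies it at runtime -->
  <scope>provided</scope>
</dependency>
```

Leave the scope off and maven defaults to 'compile', which packages the jar into your war and can clash with the container's own copy.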

There are some more things in maven that I haven't had to look at closely. It seems the pom.xml file can specify a parent to inherit, which means if you have a lot of common stuff in your pom files you can consolidate that into one file and inherit it. Superficially this looks like the included ant file I mentioned earlier but there is a key difference. The parent pom file can be kept in the repository and fetched by the maven dependency manager. That means you don't have to violate DRY, and you still don't have to jump through hoops to get the parent file into your project.
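Inheriting a parent is just a matter of naming it in the child pom; maven fetches it from the repository like any other artifact (the coordinates here are illustrative):

```xml
<parent>
  <groupId>nz.co.example</groupId>
  <artifactId>common-parent</artifactId>
  <version>1.0.0</version>
</parent>
```

Anything the parent defines (plugin versions, common dependencies, properties) is inherited by every child, so the common stuff lives in exactly one place.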

So for me the maven builds turn out generally simpler, more consistent and there is no violation of DRY.