| title | image |
| --- | --- |
| Scheduled Tasks | ../images/batch.jpg |
Tapestry provides an API for running tasks in background threads, but the programming context is a bit different from the typical page request. As seen when sending email, it's easy enough to use a web page as a web service, providing the same programming environment and debugging tools as any other page. Creating a scheduled task is just a matter of calling the page from a scheduled thread. To keep things simple, the jobs will go in a scheduledtask package under pages. Each job will have an @Schedule annotation and an @RequiresRole annotation for access control.
For the Hotel booking app, it might be nice to have a report containing today's arriving guests. Since this report contains PII, the task will create a report for each hotel and email a link if there are any guests arriving that day. The code is simple, and since it's just a web page, it's easy to test -- except that the code sends an email every time it's run.
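A minimal sketch of what such a task page might look like follows. The @Schedule and @RequiresRole annotations are assumed to be application-defined, and the class name, cron expression, and onActivate body are only illustrative:

```java
package com.example.pages.scheduledtask;

import org.apache.tapestry5.StreamResponse;
import org.apache.tapestry5.util.TextStreamResponse;

// @Schedule and @RequiresRole are hypothetical application-defined
// annotations; the cron expression here is just an example.
@Schedule(cron = "0 0 6 * * ?")
@RequiresRole("admin")
public class DailyArrivals {

    // Requesting the page runs the job; a plain-text response is easier
    // for a scheduler to read than rendered HTML.
    StreamResponse onActivate() {
        // ... the actual work goes here ...
        return new TextStreamResponse("text/plain", "OK");
    }
}
```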
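Filling in the body of that task page might look something like this. HotelDao, Hotel, Booking, and EmailService are hypothetical application types, while PageRenderLinkSource is Tapestry's own service for building links to other pages:

```java
// Hypothetical DAO and email service; PageRenderLinkSource is Tapestry's own.
@Inject private HotelDao hotelDao;
@Inject private EmailService emailService;
@Inject private PageRenderLinkSource linkSource;

StreamResponse onActivate() {
    for (Hotel hotel : hotelDao.findAll()) {
        List<Booking> arrivals = hotelDao.findArrivals(hotel, LocalDate.now());
        if (arrivals.isEmpty()) continue; // nobody arriving today, no email

        // The report contains PII, so only a link to the protected page is mailed.
        String url = linkSource
                .createPageRenderLinkWithContext("ArrivalsReport", hotel.getId())
                .toAbsoluteURI();
        emailService.send(hotel.getOwnerEmail(), "Today's arrivals", url);
    }
    return new TextStreamResponse("text/plain", "OK");
}
```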
It's easy to create a link to the report page here, but a bit more difficult in a background task. Most users appreciate testing but don't really want to be part of it. It's possible to add some code to avoid sending the email when testing, but there is a better way.
Tapestry provides the ability to load additional IoC configuration based on the execution mode. For development, it's possible to override the email service with one that just logs. Creating the service is easy -- just implement the email service interface, add a logger and call it. In the development module, just override the existing email service and test away.
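A sketch of that development module, assuming an application-defined EmailService interface with an illustrative send method; the module itself is loaded only in development (for example via the tapestry.development-modules context parameter), and ServiceOverride is Tapestry's standard mechanism for swapping a service:

```java
package com.example.services;

import org.apache.tapestry5.ioc.MappedConfiguration;
import org.apache.tapestry5.ioc.annotations.Contribute;
import org.apache.tapestry5.ioc.services.ServiceOverride;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class DevelopmentModule {

    // A stand-in implementation that just logs instead of sending mail.
    public static class LoggingEmailService implements EmailService {
        private final Logger logger = LoggerFactory.getLogger(LoggingEmailService.class);

        @Override
        public void send(String to, String subject, String body) {
            logger.info("Would send email to {}: {} / {}", to, subject, body);
        }
    }

    // ServiceOverride replaces the real EmailService everywhere it is injected.
    @Contribute(ServiceOverride.class)
    public static void overrideEmail(MappedConfiguration<Class, Object> configuration) {
        configuration.add(EmailService.class, new LoggingEmailService());
    }
}
```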
The report page is also easy -- just run the query and show the results in a grid. This page needs to be protected in two ways. First, only hotel owners should be able to access it. Second, owners should only be able to see reports for their own hotels. By now, it's becoming obvious that there is a need for a permission system. Before heading too far down the path of inventing one on the fly, it's time to look at existing options. Apache Shiro has been around for a while, and there is a Tapestry module that supports it.
To run the jobs, a service needs to go through all the pages and add a job to the PeriodicExecutor. The service for this is easy enough but needs to run on startup. That's accomplished by adding @EagerLoad to the service. The service just goes through the pages in the package, grabs the info from the @Schedule annotation and calls addJob. The return value from addJob is stored in a map and, along with the last run status, made available for use by a status page. Two status pages are needed: one for the list of jobs and one for the job details.
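A sketch of the owner-facing report page, assuming the tapestry-security (Shiro) module is installed so Shiro's @RequiresRoles annotation is honored on pages; HotelDao and Booking are again hypothetical, as is the assumption that the Shiro principal is the owner's username:

```java
package com.example.pages;

import java.time.LocalDate;
import java.util.List;

import org.apache.shiro.SecurityUtils;
import org.apache.shiro.authz.annotation.RequiresRoles;
import org.apache.tapestry5.annotations.Property;
import org.apache.tapestry5.ioc.annotations.Inject;

// First layer of protection: only hotel owners may reach the page at all.
@RequiresRoles("hotelOwner")
public class ArrivalsReport {

    @Inject
    private HotelDao hotelDao;

    // Bound to a t:grid component in ArrivalsReport.tml.
    @Property
    private List<Booking> arrivals;

    @Property
    private Booking row;

    void onActivate(Long hotelId) {
        // Second layer: only return data for hotels the current user owns
        // (assumes the principal is the owner's username).
        String owner = (String) SecurityUtils.getSubject().getPrincipal();
        arrivals = hotelDao.findArrivalsForOwner(owner, hotelId, LocalDate.now());
    }
}
```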
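One way the startup service might look, as a rough sketch: @Schedule is the hypothetical annotation from earlier, the hard-coded localhost URL used to call the page is only illustrative, and the service is assumed to be bound in the application module so @EagerLoad takes effect. PeriodicExecutor, CronSchedule, PeriodicJob, and ComponentClassResolver are real Tapestry services and types.

```java
package com.example.services;

import java.io.InputStream;
import java.net.URL;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.apache.tapestry5.ioc.annotations.EagerLoad;
import org.apache.tapestry5.ioc.services.cron.CronSchedule;
import org.apache.tapestry5.ioc.services.cron.PeriodicExecutor;
import org.apache.tapestry5.ioc.services.cron.PeriodicJob;
import org.apache.tapestry5.services.ComponentClassResolver;

@EagerLoad
public class JobScheduler {

    private final Map<String, PeriodicJob> jobs = new ConcurrentHashMap<>();
    private final Map<String, String> lastRunStatus = new ConcurrentHashMap<>();

    public JobScheduler(PeriodicExecutor executor, ComponentClassResolver resolver) {
        for (String pageName : resolver.getPageNames()) {
            if (!pageName.startsWith("scheduledtask/")) continue;

            try {
                Class<?> pageClass = Class.forName(resolver.resolvePageNameToClassName(pageName));
                Schedule schedule = pageClass.getAnnotation(Schedule.class);
                if (schedule == null) continue;

                // Register the job; the Runnable simply requests the page URL.
                PeriodicJob job = executor.addJob(new CronSchedule(schedule.cron()), pageName,
                        () -> runPage(pageName));
                jobs.put(pageName, job);
            } catch (ClassNotFoundException e) {
                lastRunStatus.put(pageName, "NOT REGISTERED: " + e.getMessage());
            }
        }
    }

    private void runPage(String pageName) {
        // Illustrative base URL; a real service would read this from configuration.
        try (InputStream in = new URL("http://localhost:8080/" + pageName).openStream()) {
            in.readAllBytes(); // drain the response
            lastRunStatus.put(pageName, "OK");
        } catch (Exception e) {
            lastRunStatus.put(pageName, "FAILED: " + e.getMessage());
        }
    }

    // Exposed for the two status pages.
    public Map<String, PeriodicJob> getJobs() { return jobs; }
    public Map<String, String> getLastRunStatus() { return lastRunStatus; }
}
```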
This makes for a nice, simple scheduling tool. It's tempting to add more features, but there is a better way. Jenkins provides a way to run jobs on a schedule, email status, track history, and so on. Installing Jenkins is easy, since it's available as an application or a WAR file.
There are a few ways to construct a page to run a job. A simple way is to just create a normal java/tml pair. That works, but turns out not to be very convenient. First, it's difficult for Jenkins to know whether the job worked or not. While the HTML output works OK for successful jobs, it's not great when they fail. Besides, Jenkins prefers text output. What's really needed is some kind of logging mechanism, perhaps with various levels and output formats.
Fortunately, there is one already included in Tapestry. Injecting a Logger does exactly that. The trick is to make the logger output become the page response. Log4j has a socket appender that writes logging events to a server. The server will be a Tapestry service, and the logs will be stored per page. First, add the socket appender to the Log4j properties file.
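A minimal entry might look like the following, assuming Log4j 1.x; the appender name is arbitrary and 4560 is simply the SocketAppender default port:

```properties
# Forward all logging events to the in-app log server.
log4j.rootLogger=INFO, SOCKET

log4j.appender.SOCKET=org.apache.log4j.net.SocketAppender
log4j.appender.SOCKET.RemoteHost=localhost
log4j.appender.SOCKET.Port=4560
log4j.appender.SOCKET.ReconnectionDelay=10000
```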
The Page object will represent a request. The one trick here is that the log events arrive asynchronously, so the List that stores them needs to be safe for concurrent access.
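A sketch of that object, using a CopyOnWriteArrayList so the socket reader thread and the request thread can touch it at the same time:

```java
package com.example.services.logging;

import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

import org.apache.log4j.spi.LoggingEvent;

// One instance per request; events arrive from the socket reader thread
// while the request thread may already be collecting them.
public class Page {

    private final int id;
    private final List<LoggingEvent> events = new CopyOnWriteArrayList<>();

    public Page(int id) { this.id = id; }

    public int getId() { return id; }

    public void add(LoggingEvent event) { events.add(event); }

    public List<LoggingEvent> getEvents() { return events; }
}
```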
Next, a page needs to be created for each request. Tapestry contains an example of this in the page timing filter. To keep track of the pages, a sequence number is generated with an AtomicInteger and added to the logging context with MDC.put, which attaches name/value pairs to each logging event.
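Modeled loosely on the timing filter, a request filter along these lines could do the job; PageLogService is a hypothetical service that owns the per-request Page objects, and the "pageId" key is arbitrary:

```java
package com.example.services.logging;

import java.io.IOException;
import java.util.concurrent.atomic.AtomicInteger;

import org.apache.log4j.MDC;
import org.apache.tapestry5.services.Request;
import org.apache.tapestry5.services.RequestFilter;
import org.apache.tapestry5.services.RequestHandler;
import org.apache.tapestry5.services.Response;

public class PageLogFilter implements RequestFilter {

    private final AtomicInteger sequence = new AtomicInteger();
    private final PageLogService pageLogService;

    public PageLogFilter(PageLogService pageLogService) {
        this.pageLogService = pageLogService;
    }

    @Override
    public boolean service(Request request, Response response, RequestHandler handler) throws IOException {
        int id = sequence.incrementAndGet();
        pageLogService.createPage(id);
        // The MDC value travels with every logging event for this request.
        MDC.put("pageId", Integer.toString(id));
        try {
            return handler.service(request, response);
        } finally {
            MDC.remove("pageId");
        }
    }
}
```

The filter still has to be contributed to the RequestHandler pipeline in the application module.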
Lastly, a service is needed to coordinate all of this. First is the log server. This will need to run in its own thread, so create a Runnable class. The server is pretty simple. It opens a socket, accepts a connection, then reads logging events and adds them to the matching pages as they come in.
The next bit is running the job and returning the logs. This requires a method that takes a Runnable and returns a StreamResponse. The idea is to try the run method, which will throw an exception if something goes wrong. The sleep is there to give the last log events a chance to arrive. The rest is a simple StreamResponse that gathers up the data, sets the headers and returns the result.
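A minimal sketch of the receiving end of the SocketAppender; the port must match the one in the properties file, PageLogService is the same hypothetical coordinator as above, and error handling is kept deliberately brief:

```java
package com.example.services.logging;

import java.io.ObjectInputStream;
import java.net.ServerSocket;
import java.net.Socket;

import org.apache.log4j.spi.LoggingEvent;

public class LogServer implements Runnable {

    private final PageLogService pageLogService;
    private final int port;

    public LogServer(PageLogService pageLogService, int port) {
        this.pageLogService = pageLogService;
        this.port = port;
    }

    @Override
    public void run() {
        try (ServerSocket serverSocket = new ServerSocket(port);
             Socket socket = serverSocket.accept();
             ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {

            while (true) {
                // The SocketAppender writes serialized LoggingEvent objects.
                LoggingEvent event = (LoggingEvent) in.readObject();
                String pageId = (String) event.getMDC("pageId");
                if (pageId != null) {
                    pageLogService.addEvent(Integer.parseInt(pageId), event);
                }
            }
        } catch (Exception e) {
            // Connection closed or server shutting down.
        }
    }
}
```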
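A sketch of that helper, assuming it lives on a small service alongside the hypothetical PageLogService; the 500 ms pause is an arbitrary grace period for the final events to reach the log server:

```java
package com.example.services.logging;

import org.apache.log4j.spi.LoggingEvent;
import org.apache.tapestry5.StreamResponse;
import org.apache.tapestry5.util.TextStreamResponse;

public class JobRunner {

    private final PageLogService pageLogService;

    public JobRunner(PageLogService pageLogService) {
        this.pageLogService = pageLogService;
    }

    public StreamResponse run(int pageId, Runnable job) {
        String status = "SUCCESS";
        try {
            job.run();
        } catch (Exception e) {
            status = "FAILED: " + e.getMessage();
        }

        try {
            Thread.sleep(500); // give the last events time to reach the log server
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }

        // Gather the collected events into a plain-text response for Jenkins.
        StringBuilder body = new StringBuilder(status).append('\n');
        for (LoggingEvent event : pageLogService.getEvents(pageId)) {
            body.append(event.getLevel()).append(' ')
                .append(event.getRenderedMessage()).append('\n');
        }
        return new TextStreamResponse("text/plain", body.toString());
    }
}
```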