Thursday, February 28, 2013

Find Java Class version

The command javap reports information about a class, including the major and minor version numbers. However, to quickly determine a class's version (i.e. the JDK version used to compile it) on Linux or OSX, just use the file command. Here is a JDK6-compiled class on OSX:
$ file MyClass.class
MyClass.class: compiled Java class data, version 50.0 (Java 1.6)
Here is a JDK7 compiled class on Linux:
$ file MyClass.class
MyClass.class: compiled Java class data, version 51.0
As a reminder here are the available versions as of today:
major  minor Java platform version 
45       3           1.0
45       3           1.1
46       0           1.2
47       0           1.3
48       0           1.4
49       0           1.5
50       0           1.6
51       0           1.7
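The file command reads this straight from the class file header: bytes 6-7 hold the major version, big-endian. Here is a sketch that reads it directly; the header bytes are written by hand to simulate a JDK6 class file (real class files start with the same 8-byte layout):

```shell
# Simulate a class file header: magic CAFEBABE, minor 0, major 50 (Java 1.6).
printf '\xca\xfe\xba\xbe\x00\x00\x00\x32' > /tmp/Demo.class
# Bytes 6-7 are the big-endian major version:
major=$(od -An -j6 -N2 -tu1 /tmp/Demo.class | awk '{print $1*256+$2}')
echo "major version: $major"
```

Looking up 50 in the table above gives Java 1.6, matching the file output.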

Monday, February 25, 2013

Solaris find and xargs with files and directories containing spaces

Solaris does not ship with the GNU versions of standard tools like find and xargs. The GNU commands play nicely together even when directory and file names contain spaces (find -print0, xargs -0).

That is not the case on Solaris and other Unix systems, where you need to get a little more creative. Basically you need an intermediate command to quote the lines produced by find so that xargs can parse them correctly.

As an example, on Solaris you would find occurrences of a pattern string inside all files under a given directory (when blanks are expected in names) as shown below:
$ find /path/to/dir/ -name "*" | sed 's/.*/"&"/' | xargs grep "searchedPattern"
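For comparison, on Linux (or anywhere the GNU tools are installed) the null-delimited variant handles spaces, and even newlines, in names without the sed trick. The sample directory and file below are just for demonstration:

```shell
# Set up a file whose name contains a space:
mkdir -p /tmp/spacedemo && echo "searchedPattern here" > "/tmp/spacedemo/my file.txt"
# -print0 / -0 delimit names with NUL instead of newline, so spaces are safe:
find /tmp/spacedemo -type f -print0 | xargs -0 grep -l "searchedPattern"
```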

Agile Java deployment for all

Continuous Integration is a necessary milestone on the way to agility. It should be configured to publish snapshots to the artifact repository, supporting teams organized around separation of concerns.

There is no need to build the application again and again on many different machines; that just creates waste (more CPU usage, for one). For instance, your Data and Front End teams might need the latest version of the application from trunk so they can work on functionality that depends on it. They use fully automated VDI boxes which host the specific application version deployed at the time of the initial build. It makes sense, then, to have a single command that deploys the snapshot onto their dev boxes. A simple command they can run themselves.

Configuring Jenkins to deploy into Artifactory is a snap. First the Artifactory server is configured in Jenkins as explained in the plugin documentation; then, following that same guide, the only thing left to enable an automated push to Artifactory is to open the "Configure" option for the WAR project, locate "Post-build Actions", check "Deploy artifacts to Artifactory" and select the Artifactory server.

You could build your own script to take care of the magic of deploying the snapshot artifact into Apache, Tomcat and whatnot. A hint for such a script, in terms of parameters, could be:
./remoteWarDeploy.sh username desktop.sample.com https://artifactory.sample.com/libs-snapshots-local/com/sample/cool-app/1.2.3-SNAPSHOT/cool-app-1.2.3-SNAPSHOT.war
Needless to say the same script could be used to deploy a released version as well.
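A minimal sketch of what such a remoteWarDeploy.sh could look like. Everything here is an assumption: the Tomcat webapps path, the use of curl/scp/ssh, and the DRY_RUN switch (added so the commands can be previewed without a real target host):

```shell
#!/bin/bash
# Hypothetical deploy helper: fetch a WAR from Artifactory and push it to a dev box.
# Usage: remoteWarDeploy.sh <username> <host> <artifact-url>
remote_war_deploy() {
  local user="$1" host="$2" url="$3"
  local war
  war=$(basename "$url")
  # With DRY_RUN set, only print the commands instead of executing them:
  run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }
  run curl -fsS -o "/tmp/$war" "$url"                            # download the snapshot
  run scp "/tmp/$war" "$user@$host:/tmp/$war"                    # copy it to the dev box
  run ssh "$user@$host" "cp /tmp/$war /var/lib/tomcat/webapps/"  # redeploy (path is an assumption)
}

# Preview the commands only:
DRY_RUN=1 remote_war_deploy username desktop.sample.com \
  "https://artifactory.sample.com/libs-snapshots-local/com/sample/cool-app/1.2.3-SNAPSHOT/cool-app-1.2.3-SNAPSHOT.war"
```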

Sunday, February 24, 2013

The Car Factory, The Software Shop and La Fabrica de Chorizos

Spain is well known for great sausages. A good friend of mine coined a term a while ago, "La Fabrica de Chorizos", to refer to an important organization pattern in any company striving for Lean, Agile productivity. Basically the idea is that you should not need to be an expert in the business and in all the technologies the company uses in order to be productive on day one. There must be a "production line" that is easy to "operate" and which results in constant "product delivery". There must be "no waste" (the creation of anything that is not valuable for the business).

In America the literal translation is not good enough, and members of my team have suggested calling it the "Car Factory" instead of the "Sausage Factory". I think it is actually the best term, even for other languages like Spanish. In fact the whole Agile movement was born from Japanese Lean manufacturing approaches in precisely the automotive industry, more specifically from the success experienced by Toyota.

A software shop then needs to function like the "Car Factory", otherwise the cost of operations and development will hold the team back, resulting in competitive disadvantage. The best term is actually "Software Factory". Quoting David Anderson's blue book:
Once a team is capable of focusing on quality, limiting WIP and delivering often, and balancing demand against throughput, they will have a reliable, trustworthy, software development capability: an engine for making software! A “software factory” if you will!

MAC OSX CIFS access and Windows account locked 0xC000006A

A Mac running OSX is not commonly joined to a Windows domain (is that even possible?), but from your Mac you still access company resources like CIFS shares.

If your account gets locked in the DC and the "Source Workstation" shows as "\\workstation", most likely you have mounted the CIFS resource in a way that makes OSX use the NetBIOS name resolver while the real name of the machine cannot be resolved.
Logon attempt by: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
Logon account: 
Source Workstation: \\workstation
Error Code: 0xC000006A
The above will be the result for at least one reproducible test case I am sharing today. If you automount the CIFS share like in:
$ sudo cat /etc/fstab
cifs.example.com:/path/to/foo /mnt/foo url url==cifs://myusername:wrongpassword@cifs.example.com/path/to/foo 0 0
$ sudo automount -vc
Then your account will get locked after a few attempts to list the content of /mnt/foo, which will always result in an error:
$ ls /mnt/foo
ls: foo: Authentication error
How do you then make sure the Mac is correctly registered as the "Source Workstation" in the Security event log? The sysadmin needs this to understand exactly which machine the failed attempt came from.

Most likely you will be able to resolve this issue by looking into DHCP and DNS. Is your DHCP updating DNS? If not, the DC will most likely be unable to show the correct information in its event log (out of the box) and will list the "Source Workstation" as "\\workstation".

Enabling Netlogon logging in the DC should be of big help while troubleshooting this kind of issue:
  1. Enable netlogon logging: nltest /dbflag:0x2080ffff
  2. Restart netlogon service
  3. Inspect logs from %windir%\debug\netlogon
  4. Disable netlogon logging: nltest /dbflag:0x0

Avoiding the issue

If you cannot join the domain, then you should delete any keychain entry for your "domain\user" and manually change the password for the specific account, which most likely is set up for email and calendars (System Preferences | Internet Accounts).

Advent Geneva Solaris script works from console but not from cron or other scheduler like monit

Do not assume your script will work from a crontab entry or from any other scheduler. In particular I like to use monit, as it will send me an alert only on the first failure and will not bother me again until the issue has been fixed. On the other hand, cron mail configuration in Solaris gets messy, so monit can be a great replacement.

Just for the sake of an example showing some tips for future use of monit as a scheduler for scripts which use mixed shells, run as a different user, and need log files besides the usual notification, let me share this showcase.

Showcase 1

Advent Geneva backup is a script that extracts the AGA data and pushes it to an external repository; it can even restore that AGA remotely. It assumes an NFS mount point is available, so we need to build a wrapper, run every so often from monit, which mounts the needed path and runs the original backup script, sending its output to a log file.

Solution

Here is such a wrapper script. Even though the original script is csh, we use bash for our wrapper. Note that we simply ignore whether the umount command is unsuccessful:
 
#!/bin/bash -e
# Monit wrapper for backup_geneva_aga.csh
# Monit needs a single script with no params which is run as root, but backup_geneva_aga.csh needs to run as geneva and log its output to a file
#

/usr/sbin/umount /mnt/genback; echo Ignore status for umount
/usr/sbin/mount /mnt/genback 
/usr/bin/su geneva -c '/export/home/geneva/scripts/backups/backup_geneva/backup_geneva_aga.csh > /export/home/geneva/scripts/backups/backup_geneva/backup_geneva_aga.log'
We might think our original backup script is great, but it might be using profile variables that will not be available by the time the scheduled process triggers. We do not want to just source the ~/.cshrc file, as it surely contains statements that will interfere with a non-interactive shell process, which is what it will be when run from a scheduler. So we might have to declare such variables ourselves. We also want to use the -e flag again, to make sure any failing command causes the script to return immediately with a status > 0:
 
#!/usr/bin/csh -e
setenv HOME /export/home/geneva
setenv KRFSBACKUPS $HOME/backups
setenv GVHOME /usr/advent/geneva-x.y.z
setenv PATH "$GVHOME/bin:${PATH}"
...
Finally we schedule using /usr/local/etc/monitrc:
 
...
check program backup_geneva_aga-monit-wrapper with path "/export/home/geneva/scripts/backups/backup_geneva/backup_geneva_aga-monit-wrapper.sh" with timeout 3600 seconds
 every "55 22 * * *"
 if status != 0 then alert
...

Showcase 2

There is a binary executable called recoveraga which we use from a bash script. The bash script works perfectly fine from the command line, but from cron it fails when trying to run recoveraga:
ERROR: /home/qabldx86_dasa/1000/1000u1/rel/aga/src/utils/agaDaemonBase.cpp(1170): AHS:00067: Couldn't remove AGA 4635 completely

Solution

As usual, printing the environment variables (env command) lets you find the differences between an interactive session and running from cron. From that comparison it was clear that the SHELL variable was different (it pointed to sh instead of bash). Using an export inside the script solved the issue:
export SHELL=/usr/bin/bash
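The comparison itself can be reproduced anywhere: dump the interactive environment, then dump a stripped-down environment similar to what cron provides (env -i), and diff the two. The specific variables passed below are just illustrative:

```shell
# Environment as seen from the interactive shell:
env | sort > /tmp/env.interactive
# Minimal environment, roughly what cron gives a job:
env -i SHELL=/bin/sh PATH=/usr/bin:/bin sh -c 'env | sort' > /tmp/env.cron
# The differences (SHELL, PATH, HOME, ...) are the usual suspects:
diff /tmp/env.interactive /tmp/env.cron || true
```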

Friday, February 15, 2013

Javascript accessing JSR303 or custom validation annotations

Wouldn't it be great to have all JSR-303 constraints, custom validation annotations, Hibernate-specific validations, the field type and so on accessible from the front end?
 
var clientValidations = ${ju:getValidations(client)};


Here is a new method for the JsonUtils.java I have been working on to support a richer application:
 
...
    /**
     * Given a target object it returns all fields and annotations for those related to validations: Map<String fieldName,
     * Map<String annotationName, Map<String annotationAttributeName, String annotationAttributeValue>>
     * 
     * @param target
     * @return
     * @throws IOException
     * @throws JsonMappingException
     * @throws JsonGenerationException
     */
    public static String getValidations(Object target) throws JsonGenerationException, JsonMappingException, IOException {
        Map<String, Map<String, Map<String, Object>>> validations = new HashMap<String, Map<String, Map<String, Object>>>();
        List<String> validationPackages = new ArrayList<String>();

        // JSR 303
        validationPackages.add("javax.validation.constraints");
        // Hibernate
        validationPackages.add("org.hibernate.validator.constraints");
        // Custom
        validationPackages.add("com.sample.validator.constraints");

        Field[] fields = target.getClass().getDeclaredFields();
        for (Field field : fields) {
            boolean accessible = field.isAccessible();

            if (!accessible) {
                field.setAccessible(true);
            }
            Annotation[] annotations = field.getAnnotations();
            Map<String, Map<String, Object>> validationsMap = new HashMap<String, Map<String, Object>>();
            for (Annotation annotation : annotations) {
                if (validationPackages.contains(annotation.annotationType().getPackage().getName())) {
                    String annotationName = annotation.annotationType().getName();
                    Map<String, Object> annotationAtributes = AnnotationUtils.getAnnotationAttributes(annotation);
                    Iterator<String> it = annotationAtributes.keySet().iterator();
                    while (it.hasNext()) {
                        String key = it.next();
                        // Remove unneeded attributes
                        if ("groups".equals(key) || "payload".equals(key)) {
                            it.remove();
                        }
                    }
                    validationsMap.put(annotationName, annotationAtributes);
                }
            }
            Map<String, Object> typeMap = new HashMap<String, Object>();
            typeMap.put(TYPE, field.getType());
            validationsMap.put(TYPE, typeMap);
            if (validationsMap.size() > 0) {
                validations.put(field.getName(), validationsMap);
            }
        }
        return toJson(validations);
    }
...
And then the JsonUtils.tld declaration:
 
<function>
        <name>getValidations</name>
        <function-class>com.sample.utils.JsonUtils</function-class>
        <function-signature>
            String getValidations(java.lang.Object)
        </function-signature>
    </function>
Here is an example of the generated JSON. As you can see there are JSR-303, Hibernate-specific and custom validation constraints listed. Time for the Javascript magic!
 
{
   "privateClientId":{
      "com.sample.validator.constraints.PrivateClientId":{
         "message":"validation.privateClientId"
      }
   },
   "sampleDistributionEmailAddress":{
      "javax.validation.constraints.Size":{
         "min":0,
         "max":50,
         "message":"{javax.validation.constraints.Size.message}"
      },
      "org.hibernate.validator.constraints.Email":{
         "message":"validation.email"
      }
   },
   "name":{
      "org.hibernate.validator.constraints.NotEmpty":{
         "message":"validation.mandatoryField"
      }
   },
   "number":{
      "javax.validation.constraints.Pattern":{
         "flags":[

         ],
         "message":"validation.client.number",
         "regexp":"^\\d{8}$"
      }
   },
   "group":{
      "javax.validation.constraints.NotNull":{
         "message":"validation.mandatoryField"
      }
   },
   "difficultyLevel":{
      "org.hibernate.validator.constraints.Range":{
         "min":1,
         "max":5,
         "message":"{org.hibernate.validator.constraints.Range.message}"
      }
   },
   "numberOfEmployees":{
     "type":{
       "type":"java.lang.Integer"
     }
   }
}

Thursday, February 14, 2013

Parsing JSON with Talend

JSON parsing has been an issue in Talend for a while. Instead of keeping the whole history here, I decided to keep this old post updated with just the latest issue found in the Talend tFileInputJSON component. It is worth mentioning, though, that the next step after the component gets into decent shape is to measure its performance, an important issue that was closed with resolution "suggestion noted".

Basically, as it stands, the component is still not useful for real-world scenarios where the JSON data is mostly unbalanced, where keys are not necessarily alphanumeric strings starting with a letter, or where the datasets must be extracted from different nesting levels.

The common workaround is to code everything needed in a tJavaFlex component:
  1. Given the json:
    {"arr":[{"id":1, "next":2},{"id":2}, {"id":3, "next":4}]}
    
  2. We build the below sample job:
  3. Using the below in the tJavaFlex
    //imports
    import java.util.Map.Entry;
    import java.util.Set;
    import org.json.simple.JSONObject;
    import org.json.simple.JSONArray;
    import org.json.simple.parser.JSONParser;
    import java.util.ArrayList;
    import java.io.ByteArrayInputStream;
    import java.io.BufferedReader;
    import java.io.ByteArrayOutputStream;
    import java.io.FileReader;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    
    //start
    JSONParser parser = new JSONParser();  
    JSONObject jsonObject = (JSONObject) parser.parse(new FileReader("/opt/samples/big.json"));
    JSONArray arr = (JSONArray) jsonObject.get("arr");
    for(int i = 0; i < arr.size(); i++)  {
    
    //main
        JSONObject item = (JSONObject) arr.get(i);
        Long itemId = (Long) item.get("id");
        Long next = (Long) item.get("next");
        row5.id = itemId;
        row5.next = next;
    
    //end
    }
  4. Once we run it we correctly get our list of id and next values:
    [statistics] connecting to socket on port 3472
    [statistics] connected
    1|2
    2|
    3|4
    [statistics] disconnected
    Job sample ended at 10:21 14/02/2013. [exit code=0]
    
Clearly that is a lot of code for a Data Analyst who should be concentrating on plumbing components together to achieve Business Intelligence goals. The way to go here is to fix the tFileInputJSON component.

JsonPath

Using JsonPath allows for a shorter implementation though:
 //import
 ...
 import com.jayway.jsonpath.JsonPath;
 ...
 //main
 ...
 Long itemId = (Long) JsonPath.read(item, "$.id");
 Long next = (Long) JsonPath.read(item, "$.next");
 ...
Let us suppose we have a little bit more complicated JSON:
{"arr":[{"book":{"id":1, "next":2}}, {"book":{"id":2}}, {"book":{"id":3, "next":4}}]}
If you want to parse nested properties and the JSON is unbalanced like in this case (there is no "next" for book.id=2), you will need to catch PathNotFoundException:
//import
 ...
 import com.jayway.jsonpath.JsonPath;
 import com.jayway.jsonpath.PathNotFoundException;
 ...
 //main
 ...
 Long itemId = null;
 Long next = null;
 //try needed only for nested path like in cases like "$.book.name"
 try {
   itemId = (Long) JsonPath.read(item, "$.book.id");
 } catch (PathNotFoundException e) {
 }
 try {
   next = (Long) JsonPath.read(item, "$.book.next");
 } catch (PathNotFoundException e) {
 }
 ...
This approach requires the libraries below, which you can get from the json-path project:
json-path-0.9.0.jar
slf4j-api-1.7.5.jar
json-smart-1.2.jar

External Parser

Even though JsonPath from Java should be OK, you might find other surprises, who knows what ;-) In that case you might want to consider processing JSON with external tools. One of those tools is described here.

Wednesday, February 13, 2013

The method X is undefined for the type Y

Classloader issues, yes, we all know about them. They are behind 99% of the "mysterious" behaviors Java developers face when suddenly a well-known class apparently lacks a method that certainly exists there.

When you face these issues, try including the class you think is in your classpath once again, in a place where you know the classloader is looking. Also check for duplicates. If you are in doubt about which of your jars contains the class (or a related package), then you need to inspect inside your jars.

Just for the sake of my memory this was the cause for the below error which was happening in one of the developer's machines:
 The method getJSONArray(String) is undefined for the type JSONObject
It was happening in Talend Open Source, by the way, where the tLibraryLoad component is sometimes forgotten by the developer, since in many cases the classes "seem" to be already loaded and available, when in reality that depends on the Eclipse classloader algorithms. For instance, include the missing class from a component and delete the component afterwards: it will "magically" be found. Or play with the tFileInputJSON component for a while and then try to use the JSONObject class from a tJavaFlex: it will be available. Most likely, restart Talend without loading the class at all and the "mystery" returns :)

Using tLibraryLoad to include the json_simple-1.1.jar, which ships as part of tFileInputJSON, did the trick.

Create a big JSON file for test purposes

Sometimes you need to try your parser with huge files, either for performance reasons or to find its limitations. Talend tFileInputJSON has a bug in version 5, as stated in this post.

Any script will do the trick here. Here is a quick bash script to generate such a big JSON file:
generateBigJson.sh
#!/bin/bash
#generateBigJson.sh

MAX=15000
echo "{\"arr\":["
i=1
while [ $i -le $MAX ]; do
  echo -ne "{\"id\":$i, \"next\":$((i+1))}"
  if [ "$i" -ne "$MAX" ]; then
    echo ","
  fi
  let i=i+1
done
echo "]}"
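A quick sanity check of the generator's output, shown here inline with a small MAX so it runs fast: feed the result to Python's stdlib JSON parser, which rejects trailing commas and similar slips (python3 here; the older python -mjson.tool works the same):

```shell
# Same generation logic as the script above, with MAX=3 for a quick check:
MAX=3
{ echo "{\"arr\":["
  i=1
  while [ $i -le $MAX ]; do
    echo -ne "{\"id\":$i, \"next\":$((i+1))}"
    if [ "$i" -ne "$MAX" ]; then echo ","; fi
    i=$((i+1))
  done
  echo ""
  echo "]}"
} > /tmp/big.json
# Parse it; any structural error would make this fail:
python3 -m json.tool < /tmp/big.json > /dev/null && echo "valid JSON"
```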

Saturday, February 09, 2013

Exposing all Spring i18n messages to Angular.JS or any rich front end

Spring does a great job with internationalization. Simple, straightforward, great! However, I need to easily access the messages from a rich Angular.JS-based front end. For that, it would be ideal to expose the messages in a JSON format that is loaded in the browser when the app starts, right? It then makes sense to have a simple directive to load them all from JSP/JSTL, like:
<script>
  var messages = ${ju:getMessages(locale)};
</script>
By default you easily find methods to get messages key by key, but when you need all of them you will need to create your own MessageSource. So we declare it:
   <bean id="messageSource"
          class="com.sample.web.CustomReloadableResourceBundleMessageSource">
          <qualifier value="messageSource"/>
        <!--  <property name="basename" value="classpath:messages" />  -->
        <property name="basenames">
            <value>/WEB-INF/i18n/messages</value>
        </property> 
        <property name="cacheSeconds">
            <value>60</value>
        </property>
        <property name="fallbackToSystemLocale" value="false" />
    </bean>
The java code for the custom Message Source:
package com.sample.web;

import java.util.Locale;
import java.util.Properties;

import org.springframework.context.support.ReloadableResourceBundleMessageSource;

public class CustomReloadableResourceBundleMessageSource extends ReloadableResourceBundleMessageSource {

 public Properties getAllProperties(Locale locale) {
  clearCacheIncludingAncestors();
  PropertiesHolder propertiesHolder = getMergedProperties(locale);
  Properties properties = propertiesHolder.getProperties();

  return properties;
 }
}
A new method for our taglib, named getMessages() (if you missed the tutorial for creating the taglib in the first place, just search this blog for it):
<?xml version="1.0" encoding="UTF-8"?>
<taglib xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-jsptaglibrary_2_0.xsd"
    version="2.0">
    <tlib-version>2.1</tlib-version>
    <uri>JsonUtils</uri>
     
    <function>
        <name>toJson</name>
        <function-class>com.sample.utils.JsonUtils</function-class>
        <function-signature>
            String toJson(java.lang.Object)
        </function-signature>
    </function>
    
    <function>
        <name>getMessages</name>
        <function-class>com.sample.utils.JsonUtils</function-class>
        <function-signature>
            String getMessages(java.util.Locale)
        </function-signature>
    </function>
</taglib>
The new utility method for the taglib:
package com.sample.utils;

import java.io.IOException;
import java.util.Locale;

import org.codehaus.jackson.JsonGenerationException;
import org.codehaus.jackson.map.JsonMappingException;

import com.sample.serialization.JacksonObjectMapper;
import com.sample.utils.web.ApplicationServletContextListener;
import com.sample.web.CustomReloadableResourceBundleMessageSource;

public final class JsonUtils {

 private JsonUtils() {
 }

 public static String toJson(Object value) throws JsonGenerationException, JsonMappingException, IOException {
  JacksonObjectMapper mapper = new JacksonObjectMapper();
  return mapper.writeValueAsString(value);
 }

 public static String getMessages(Locale locale) throws JsonGenerationException, JsonMappingException, IOException {
  CustomReloadableResourceBundleMessageSource messageSource = (CustomReloadableResourceBundleMessageSource) ApplicationServletContextListener
    .getBean("messageSource");
  JacksonObjectMapper mapper = new JacksonObjectMapper();
  return mapper.writeValueAsString(messageSource.getAllProperties(locale));
 }
}
Finally, don't forget to inject the locale into your View from the Controller. I prefer a single "locale" variable so the front end engineer just has to use the simple statement that started this post.

java.lang.ClassCastException: CustomReloadableResourceBundleMessageSource cannot be cast to org.springframework.context.support.DelegatingMessageSource

I needed to hook into Spring to use my own MessageSource for internationalization. I declared my custom bean and when I tried to use the new functionality I got:
java.lang.ClassCastException:  com.sample.web.CustomReloadableResourceBundleMessageSource cannot be cast to org.springframework.context.support.DelegatingMessageSource
This error happens when Spring cannot find the bean and instead initializes a default. If you inspect DEBUG traces you will see that clearly.

Hence this is a context configuration issue: in my case I was declaring the bean in the Spring servlet XML rather than in the application context XML, so the root context never saw it. It took me some hours to realize what was really happening.

mount error(115): Operation now in progress ... CIFS VFS: cifs_mount failed w/return code = -115

Trying to mount a CIFS path in Ubuntu and getting:
mount error(115): Operation now in progress
First thing to try is to look into /var/log/syslog:
Feb  9 14:08:29 ldap kernel: [143452.140157] CIFS VFS: Error connecting to socket. Aborting operation
Feb  9 14:08:29 ldap kernel: [143452.140492] CIFS VFS: cifs_mount failed w/return code = -115
A socket error: we know what this is, right? IP or port. Ping the domain/IP, or telnet to test the port:
ping IP
telnet IP 445
In my case? Telnet was timing out: the port was closed for that IP in the firewall.
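When telnet is not installed, bash itself can make the same TCP probe via /dev/tcp. The IP below is a documentation placeholder (TEST-NET), so this probe times out, which is exactly the "closed or filtered" symptom described above:

```shell
# Exit status 0 means the TCP connect to port 445 succeeded:
if timeout 3 bash -c 'exec 3<>/dev/tcp/192.0.2.10/445' 2>/dev/null; then
  echo "port open"
else
  echo "port closed or filtered"
fi
```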

Wednesday, February 06, 2013

Custom JSP taglib to convert Object to JSON

The times of direct Javascript DOM manipulation are gone. Modern frameworks like Angular.js demand that JSON be available. And if you use a hybrid approach where server-side HTML templates are needed, the need to expose objects in JSON format is unavoidable.

Using simple JSP views through the help of taglibs should be a clean way to give front end developers what they need. They should be able to get JSON for their Javascript needs in a single line:
<script>
  var employees = ${ju:toJson(employees)};
</script>
Here is all you need to do in the back end (middle tier). First you need a utility class that can take an object and serialize it to JSON. The winner in my journey so far is the Jackson API, so:
package com.nestorurquiza.utils;

import java.io.IOException;

import org.codehaus.jackson.JsonGenerationException;
import org.codehaus.jackson.map.JsonMappingException;

import com.nestorurquiza.serialization.JacksonObjectMapper;

public final class JsonUtils {

 private JsonUtils() {
 }

 public static String toJson(Object value) throws JsonGenerationException, JsonMappingException, IOException {
  JacksonObjectMapper mapper = new JacksonObjectMapper();
  return mapper.writeValueAsString(value);
 }

}
Here is the custom JacksonObjectMapper class:
package com.nestorurquiza.serialization;

import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.SerializationConfig;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class JacksonObjectMapper extends ObjectMapper {

 private static final Logger log = LoggerFactory.getLogger(JacksonObjectMapper.class);

 public JacksonObjectMapper() {
  configure(SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS, false);
 }
}

Here is the taglib descriptor file /WEB-INF/tld/JsonUtils.tld:
<?xml version="1.0" encoding="UTF-8"?>
<taglib xmlns="http://java.sun.com/xml/ns/j2ee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/j2ee http://java.sun.com/xml/ns/j2ee/web-jsptaglibrary_2_0.xsd"
    version="2.0">
    <tlib-version>2.1</tlib-version>
    <uri>JsonUtils</uri>
     
    <function>
        <name>toJson</name>
        <function-class>com.nestorurquiza.utils.JsonUtils</function-class>
        <function-signature>
            String toJson(java.lang.Object)
        </function-signature>
    </function>
</taglib>
Finally do not forget to include the taglib to use from JSP:
<%@ taglib prefix="ju" uri="/WEB-INF/tld/JsonUtils.tld"%>

Monday, February 04, 2013

CouchDB keep Design Documents with views formatted

Unix Power Tools work. In many cases, from the command line you can do practically anything you want, way faster and consuming way fewer resources than from a GUI. When it comes to automation a GUI can't compete, and there are even scenarios where you really have no option other than scripting.

In a NoSQL DB you probably have no schema. I will not argue here about structured/semi-structured/unstructured choices; that is a subject for a bigger discussion, for which my short answer is "it depends". But I will argue that your views in CouchDB, or whatever you call the map/reduce functions in your favorite NoSQL store, must be kept in a version control repository.

I will also argue that any migration applied to the views should automatically generate their latest version, which should be committed to your version control system.

I should not have to say it but all of the above should be automated for sure.

Storing compressed or minified source code (in CouchDB that would be Javascript) does not make sense to me, so I would vote for storing the extracted views correctly formatted. A one-liner is enough to achieve this. Note that curl below uses -k (to ignore the fake SSL certificate of this integration environment), -s (to avoid showing progress) and -X (to specify the method, in this case a GET). Python formats the output:
$ curl -ksX GET "https://user:passwd@couch.sample.com:6984/mydb/_design/MyDocument" | python -mjson.tool
Granted, this is still not that great, as the Javascript functions do not get formatted, but it is better than the raw output of the command without Python's help.

To use the design document later on, you need to PUT it back into the CouchDB server database; however, the "_rev" attribute must be removed first. Again, nothing Unix Power Tools can't do. Here is how to do a correct backup:
#extract design document from db
curl -ksX GET "https://user:passwd@couch.sample.com:6984/mydb/_design/EmailDocument" | python -mjson.tool | sed '/\"_rev\"/d' > mydb-couchdb-design-Document.json
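Note that the sed '/"_rev"/d' deletion relies on the JSON being pretty-printed with one key per line, which python -mjson.tool guarantees. If you ever work with the raw single-line output, a small Python one-liner removes _rev regardless of formatting; the sample document below is made up for illustration:

```shell
# Drop the _rev attribute no matter how the JSON is laid out:
echo '{"_id":"_design/MyDocument","_rev":"1-abc","views":{}}' \
  | python3 -c 'import json,sys; d=json.load(sys.stdin); d.pop("_rev", None); print(json.dumps(d, indent=2))'
```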
Here is how you would put the design document back into a CouchDB database (restore). Note how we need the revision to be able to delete, and once deleted we just PUT our saved copy. We could probably do the same with a single request instead of DELETE+PUT, but you get the idea about the power of Unix tools here.
$ revision=`curl -ksX GET "https://user:passwd@udesktop2.sample.com:6984/mydb/_design/MyDocument" | sed 's/.*_rev\":\"\([^\"]*\).*/\1/g'`
$ curl -ksX DELETE "https://user:passwd@udesktop2.sample.com:6984/mydb/_design/MyDocument?rev=$revision"
$ curl -ksX PUT "https://user:passwd@udesktop2.sample.com:6984/mydb/_design/MyDocument" --data-binary @mydb-couchdb-design-Document.json
Now you can keep your views in version control, and should you need a blank database it is just a matter of creating it and running the previous commands to create the views:
#extract design document from db
curl -ksX GET "https://user:passwd@couch.sample.com:6984/mydb/_design/EmailDocument" | python -mjson.tool | sed '/\"_rev\"/d' > mydb-couchdb-design-Document.json
#delete database
curl -ksX DELETE "https://user:passwd@udesktop2.sample.com:6984/mydb"
#create database
curl -ksX PUT "https://user:passwd@udesktop2.sample.com:6984/mydb"
#put the design document in db
curl -ksX PUT "https://user:passwd@udesktop2.sample.com:6984/mydb/_design/MyDocument" --data-binary @mydb-couchdb-design-Document.json

Friday, February 01, 2013

Replace tabs by whitespace across multiple files

A new member of the team did a great job refactoring several JSP files; however, he had set his editor to use literal tabs, and our current code style is to use 4 literal spaces wherever a tab would go.

I couldn't help proposing Unix Power Tools to resolve this issue. Here are the one-liners that corrected all the JSP files, in two flavors, one with perl and one with sed:
find ./jsp/ -name "*.jsp" | xargs perl -p -i -e 's/\t/    /g'
find ./jsp/ -name "*.jsp" | xargs sed -i 's/\t/    /g'
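A self-contained check of the sed variant, using a sample file under /tmp (the real run targets ./jsp):

```shell
# Create a sample JSP containing a literal tab, run the replacement, verify:
mkdir -p /tmp/jsp && printf '<p>\thello</p>\n' > /tmp/jsp/sample.jsp
find /tmp/jsp -name "*.jsp" | xargs sed -i 's/\t/    /g'
if grep -q "$(printf '\t')" /tmp/jsp/sample.jsp; then echo "tabs remain"; else echo "clean"; fi
```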

SwfUpload plugin and Spring MaxUploadSizeExceededException

In order to show the user an error message, the SwfUpload plugin needs a plain text response from the server using the keyword "ERROR:", as in "ERROR: Maximum upload size exceeded". However, when Spring throws the exception below, the request does not even get to the controller:
org.springframework.web.multipart.MaxUploadSizeExceededException: Maximum upload size of 1000000 bytes exceeded; nested exception is org.apache.commons.fileupload.FileUploadBase$SizeLimitExceededException: the request was rejected because its size (2097682) exceeds the configured maximum (1000000)
In order to resolve this issue you need to implement HandlerExceptionResolver in your controller, which basically means you need to add a method to it:
@Override
 public ModelAndView resolveException(HttpServletRequest request, HttpServletResponse response, Object object,
   Exception exception) {
  if (exception instanceof MaxUploadSizeExceededException) {
   String message = exception.getMessage();
   // The below will still render the default exception handler view
   /*try {
    response.setContentType("text/plain");
    response.getWriter().write(error(message));
    response.flushBuffer();
   } catch (IOException e) {
    log.error(null, e);
    errorModelAndView();
   }*/

   // Returning a Plain Text View
   return new ModelAndView(new PlainTextView(error(message.substring(0, message.indexOf(";")))));

  } else {
   return errorModelAndView();
  }
 }

 private ModelAndView errorModelAndView() {
  return new ModelAndView("/error?id=error.internal");
 }
Note the commented lines. Unfortunately it is not as easy as just writing to the response: Spring expects a valid ModelAndView object in return. That is the reason why I had to create a special PlainTextView:
package com.nestorurquiza.utils.web;

import java.io.PrintWriter;
import java.util.Map;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.springframework.web.servlet.View;

public class PlainTextView implements View {
 private static final String CONTENT_TYPE = "text/plain";
 public static final String BODY = "body";
 private String body;

 public PlainTextView(String body) {
  super();
  this.body = body;
 }

 public PlainTextView() {
  super();
  this.body = null;
 }

 public String getContentType() {
  return CONTENT_TYPE;
 }

 public void render(Map model, HttpServletRequest request, HttpServletResponse response) throws Exception {
  response.setContentType(CONTENT_TYPE);
  PrintWriter out = response.getWriter();
  out.write(body);
 }
}
