
Sunday, May 25, 2014

Vaadin Valo: The new theme (since version 7.3)

Vaadin's roadmap 

Vaadin released its most recent version, 7.2, in the middle of May. It offers some interesting features, like responsive layouts for components (letting components react to size changes and apply different CSS styles depending on their current size), native support for IE11, and an improved push channel with support for long polling, Tomcat 8, WildFly 8, GlassFish 4, and Jetty 9.1. To be honest, I haven't tried all the new features yet.

As you can read on their roadmap (here or here), the next release, 7.3 (to be released in June), will contain a new theme called 'Valo', which looks pretty nice and which I wanted to try for a new project. There are already two alpha versions available, and later in this post I will show you how I got it running.

After that there will be version 7.4, which will contain a new component called 'Grid' that seems set to become very important. Currently the Table component is one of the core components used in Vaadin applications. If we believe the Vaadin guys, Grid will become Table's big brother, with lots of improvements and new features that Table can't offer (e.g. dynamic row heights). I'm very curious about it, and it really seems to be a big component, because its release will be split into two parts: a first version of Grid will ship with 7.4, while the following release, 7.5, will mainly focus on extending the Grid component.

Valo

Let's focus on the new theme 'Valo'. The Vaadin guys posted a preview picture of the new styles, and in my opinion it looks really nice:

https://vaadin.com/image/image_gallery?uuid=ee33077b-bf15-42a3-bd37-b2d2955502e9&groupId=10187&t=1396549547676


As I already wrote, it will be included in release 7.3 in June. There are already two alpha versions - alpha1 and alpha2 - which you can use to start experimenting.

Here is what I did to get a Vaadin project running with the new theme in version 7.3.0.alpha2 (I mainly followed the instructions from the release notes, but at some points I had to take a slightly different route):


  1. Create a new Maven project from the Vaadin archetype "vaadin-archetype-application" with version 7.3.0.alpha2
  2. mvn clean install
  3. Navigate to the exploded war-directory and from there navigate to WEB-INF/lib
  4. Extract "vaadin-themes-7.3.0.alpha2.jar" as it contains the needed "valo" folder
  5. Copy the folder VAADIN/themes/valo (from the extracted vaadin-themes-7.3.0.alpha2 folder) to the folder src/main/webapp/VAADIN/themes in your Vaadin project. The "valo" folder should now exist next to your "mytheme" folder.
  6. Change the content of the "mytheme.scss" file in your "mytheme" folder to the following (the UI class itself keeps its @Theme("mytheme") annotation, see the sketch after this list):
    // Any variables you wish to override should be done before importing Valo
    
    // Modify the base color of the theme
    $v-app-background-color: hsl(200, 50%, 50%);
    
    @import "../valo/valo";
    
    @mixin mytheme {
      @include v-valo;
    }
  7. Install Ruby (we need it to get a working Sass compiler; Vaadin's own compiler will be part of the final 7.3 release but is not usable in the alpha version). You can download and install it from http://www.rubyinstaller.org/. I installed it on Windows and afterwards installed the Sass compiler from the console with the following command: gem install sass
  8. Navigate via console to your project and into the folder src/main/webapp/VAADIN/themes/mytheme
  9. Enter the following command: sass styles.scss styles.css
  10. Go into your project's pom.xml and comment out the <goal>compile-theme</goal> entry in the "vaadin-maven-plugin". See the update at the end of this post for more explanation.
  11. Build (e.g. by running mvn install) and deploy your war file to your web/application server - done!
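For reference: the UI class generated by the archetype does not need to change, since it keeps pointing to "mytheme" via its @Theme annotation, and "mytheme" now pulls in Valo through the import above. A minimal sketch of such a UI class (the class name MyVaadinUI is just an assumption, use whatever the archetype generated for you):

import com.vaadin.annotations.Theme;
import com.vaadin.server.VaadinRequest;
import com.vaadin.ui.Label;
import com.vaadin.ui.UI;

@Theme("mytheme") //still refers to our theme folder, which now imports Valo
public class MyVaadinUI extends UI
{
 @Override
 protected void init(VaadinRequest request)
 {
  setContent(new Label("Hello Valo"));
 }
}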
IMPORTANT: Have a look at the styles.css file in your mytheme folder before trying to deploy the application. It should contain many thousands of lines - THAT is the right content. If it instead looks like this
@import "../valo/valo";

.mytheme {
 name: v-valo args: []
}
then the Vaadin Sass compiler (which doesn't work in that alpha version) was run; it overwrites the right content and breaks our theme. This can happen if you didn't comment out the goal entry in the vaadin-maven-plugin (see step 10) and then ran mvn install.

I was wondering why styles.css was overwritten when mvn install was executed, because that normally only happens after a vaadin:compile-theme task (from the Vaadin Maven plugin) has run, and I didn't expect that task to run during an install. But as it turns out, it does. <-- Update: Step 10 fixes that.

Result

This is how the Vaadin sample application looks with Valo:



Beautiful, isn't it?

----------

Update, May 27, 2014:
In the first version of this post, step 10 was "Check if the styles.css has the right content", and I wrote that I couldn't run mvn install because it overwrote styles.css with the wrong content, so I had to let my IDE build the war file instead. That is fixed now.

Explanation: The "compile-theme" task of the vaadin-maven-plugin tries to compile the content of styles.scss and write the compiled output to styles.css (I don't know the SCSS machinery in detail). BUT the compiler used by the compile-theme task doesn't work in the alpha version (the Vaadin guys say it will work in the final release). Because it doesn't work, we install a compiler that does work (by installing Ruby and running the commands from steps 8 and 9). After running those commands, styles.css is valid, and we don't want the compile-theme task to run anymore, because it would overwrite styles.css with invalid content. The <execution> --> <goal> entries of the vaadin-maven-plugin declare that those goals run when mvn package or install is called. That's why we have to comment out the compile-theme goal in our pom.xml. After doing that we can use mvn install as always and then deploy the war file.

Wednesday, May 21, 2014

Simple Forge for Rapid Development

This article covers the simple usage of Forge to build up an application (or at least an almost complete stack for a prototype).
So what is Forge? It is a command-line tool to create and configure Java projects. You can set up different modules (for example CDI or JPA) and Forge generates all the resources needed - anything from Java classes to deployment descriptors, right up to test classes or JSF files. On top of that, it works incrementally, so you can use it at any point in a project's life.

The setup of the modules follows Convention over Configuration, so everything is quite simple and comprehensible. The command line also supports tab completion, which makes using it very fluent.

You may ask why not Forge 2 (it is already at version 2.5.0): it is faster and a little easier, but it lacks one thing - the Arquillian module has not been migrated yet, although that is planned.

Example
First we tell Forge to accept defaults. Then we create a project named forgetest with the top-level package de.bischinger. This creates a Maven WAR project.
[no project] Development $ set ACCEPT_DEFAULTS true;
[no project] Development $ new-project --named forgetest --topLevelPackage de.bischinger;
After this you can see that the shell prompt changes from "no project" to your project name "forgetest".
Now we add persistence, namely Hibernate and WildFly (this creates a persistence.xml and adds the dependency to the pom.xml):
[forgetest] forgetest $ persistence setup --provider HIBERNATE --container WILDFLY;
Next we create a JPA entity Customer with a required field name. For this we also need to set up Bean Validation:
[forgetest] forgetest $ validation setup --provider HIBERNATE_VALIDATOR;
[forgetest] forgetest $ entity --named Customer --package de.bischinger.model;
[forgetest] Customer.java $ field string --named name;
[forgetest] Customer.java $ constraint NotNull --onProperty name;
The interesting part is, once again, that after the creation the shell switches into the created context, so you can easily add the fields. You can also list the context with ls.
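The generated Customer.java then looks roughly like this (a trimmed sketch, not the exact Forge output - Forge also generates getters, setters, equals, hashCode and toString):

import java.io.Serializable;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Version;
import javax.validation.constraints.NotNull;

@Entity
public class Customer implements Serializable
{
 @Id
 @GeneratedValue
 private Long id;

 @Version
 private int version;

 @NotNull
 private String name;

 //getters, setters, equals, hashCode, toString omitted
}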

Now we want to build a simple JSF UI for the Customer entity. First we need to set up Scaffold, which generates our UI:
[forgetest] Customer.java $ scaffold setup;
[forgetest] Customer.java $ scaffold from-entity de.bischinger.model.Customer.java;
After this we can build our application with build, and after deploying it (which can also be done with Forge, but that would be another topic) we can see the following JSF page. On the left side is the link to the customers, where we can already create, search, and delete customers. It even has paging.
OK - the GUI still has to be customized, but that is again another topic. The point is that I have spent so much time in the past to reach a state like this - now it is possible within just two minutes.

But we are not finished yet - what about tests? For this we can install the Arquillian plugin into our project. We set up Arquillian to use a WildFly managed container with JUnit, after which we can create our tests:
[forgetest] Customer.java $ forge install-plugin arquillian;
[forgetest] Customer.java $ arquillian setup --containerName WILDFLY_MANAGED --testFramework junit;
[forgetest] Customer.java $ arquillian create-test --class de.bischinger.model.Customer.java;
Done.
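The created test class is roughly of the following shape (a minimal sketch of a typical Forge-style Arquillian test; the exact content Forge generates may differ):

import javax.inject.Inject;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

import de.bischinger.model.Customer;

@RunWith(Arquillian.class)
public class CustomerTest
{
 @Deployment
 public static JavaArchive createDeployment()
 {
  //Micro deployment containing just the class under test and an empty beans.xml
  return ShrinkWrap.create(JavaArchive.class)
    .addClass(Customer.class)
    .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
 }

 @Inject
 Customer customer;

 @Test
 public void shouldBeInjected()
 {
  Assert.assertNotNull(customer);
 }
}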

Conclusion

From my point of view this tool is worth a lot. I can integrate and configure almost any standard technology (JPA, Bean Validation, ...) I want in no time, so I can really concentrate on the business logic. Forge is nicely integrated into JBoss Developer Studio, it can also be used from any shell, and it is easy to understand. The generated sources can also be studied to get familiar with unknown technologies.

Wednesday, May 14, 2014

CDI ContextResolver Pattern 2.0

What is the ContextResolver Pattern?
It is a pattern described by Sven Ruppert (here) to solve the following problem.

The Problem
A service has several implementations which are provided to clients depending on a specific environment context (for example a test or development context) on the service side. The client does not know about the context, and the environment context must be dynamically configurable.

The Solution
Decouple the service creation from the context resolution by introducing
  • a ContextResolver, which determines the current context and returns an annotation literal
  • a service context qualifier
  • a service producer which uses the service context qualifier
With that you can develop very flexible and extensible modules or applications which can be dynamically configured at runtime.

The Evolution
The previous version is implemented with CDI extensions, which is a little harder to understand and needs a javax.enterprise.inject.spi.Extension file. So here is the improved version, which uses only plain CDI producers and should therefore be easier to understand.

The client:
@Inject
@DemoLogicContext
Instance<DemoLogic> demoLogicInst;
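
@DemoLogicContext itself is just an ordinary CDI qualifier; a minimal sketch of how it could be defined (the actual definition is in the linked sources):

import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.PARAMETER;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import javax.inject.Qualifier;

@Qualifier
@Retention(RUNTIME)
@Target({ TYPE, METHOD, FIELD, PARAMETER })
public @interface DemoLogicContext
{
}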

The producer:
public class DemoLogicProducer
{
 @Produces
 @DemoLogicContext
 public DemoLogic create(BeanManager beanManager, @Any Instance<ContextResolver> contextResolvers)
 {
  return ManagedBeanCreator.createManagedInstance(beanManager, contextResolvers, DemoLogic.class);
 }
}

The ContextResolver:
public class DemoLogicContextResolver implements ContextResolver
{
 @Inject
 Context context;

 @Override
 public AnnotationLiteral<?> resolveContext(Class<?> targetClass)
 {
  //Determines the current context and returns the matching annotation literal
  return context.isUseB() ? new MandantB.Literal() : new MandantA.Literal();
 }
}
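
MandantA and MandantB are again plain qualifiers, each with a nested AnnotationLiteral, and the concrete DemoLogic variants simply carry the matching qualifier. A minimal sketch (assuming DemoLogic is a normal class that can be subclassed; MandantB looks the same):

import javax.enterprise.util.AnnotationLiteral;
import javax.inject.Qualifier;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.PARAMETER;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

@Qualifier
@Retention(RUNTIME)
@Target({ TYPE, METHOD, FIELD, PARAMETER })
public @interface MandantA
{
 class Literal extends AnnotationLiteral<MandantA> implements MandantA
 {
 }
}

//In its own file: the variant that gets resolved for context "Mandant A"
@MandantA
public class MandantADemoLogic extends DemoLogic
{
 //Mandant A specific logic
}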

The ManagedBeanCreator:
public class ManagedBeanCreator
{
 public static <T> T createManagedInstance(BeanManager beanManager, Instance<ContextResolver> contextResolvers,
   Class<? extends T> clazz)
 {
  //Use the first ContextResolver that is found
  for (ContextResolver contextResolver : contextResolvers)
  {
   AnnotationLiteral<?> annotationLiteral = contextResolver.resolveContext(clazz);
   Set<Bean<?>> beans = beanManager.getBeans(clazz, annotationLiteral);

   //Create CDI Managed Bean
   Bean<?> bean = beans.iterator().next();
   CreationalContext<?> ctx = beanManager.createCreationalContext(bean);
   return (T) beanManager.getReference(bean, clazz, ctx);
  }
  return null;
 }
}

The sources can be found here.

Have fun coding.

Monday, May 12, 2014

Mocking IoT Tinkerforge Sensors

A couple of days have passed since the Tinkerforge API was published on Maven Central. But what can you do with it if there is no hardware around? This was my situation when I wanted to try it out and play with the weather station from jaxenter (weatherstation).
So what can you do? You can either mock the hardware sensor and struggle with the sensors' protocol, or you can mock the software sensor. The latter approach seems easier to me, so let's start.
If you only want a single value request (e.g. getTemperature()), the task is easy: you only have to override the method when instantiating the sensor.
//Anonymous subclass that returns a fixed value instead of talking to real hardware
new BrickletTemperature("dV6", new IPConnection()){
    @Override
    public short getTemperature() throws TimeoutException, NotConnectedException {
           return 42;
    }
};
The more interesting task is to hack the callback listeners so that you continuously get measured values. OK, these are no real data, but they should be sufficient for testing and playing. To hack the sensor we need to dive a little into its internals: if you add a listener to a sensor, it is put into an internal list which is processed whenever a specific callback event (for example for the temperature) is fired, and the changed value is passed along in that process.
So the task is to create a thread which fires a specific callback event on a sensor.
//Creates Standard-BrickletTemperature with injected IPConnection
BrickletTemperature brickletTemperature = new BrickletTemperature("kjh6", ipConnection);

//Determine the callback event constants (note: callback-reached listeners are ignored)
List<Integer> ints = new ArrayList<>();
try {
    for (Field declaredField : brickletTemperature.getClass().getDeclaredFields()) {
        String fieldName = declaredField.getName();
        if (fieldName.startsWith("CALLBACK_") && !fieldName.endsWith("REACHED")) {
            declaredField.setAccessible(true);
            ints.add(declaredField.getInt(brickletTemperature));
        }
    }

    //Start a value generator for the bricklet (see startCallbackListenerThread below)
    startCallbackListenerThread(ipConnection, brickletTemperature, (byte) 200, ints);
} catch (NoSuchMethodException | IllegalAccessException e) {
    e.printStackTrace();
}
The callback events of each sensor are mapped to constants in the sensor class whose names start with CALLBACK_ (callback-reached events are named CALLBACK_<name>_REACHED), so you can read their values via reflection.
static void startCallbackListenerThread(IPConnection ipcon, Device bricklet, byte startValue, List<Integer> callbackIndizes) throws NoSuchMethodException {
        Class<IPConnection> ipConnectionClass = IPConnection.class;
        Method callDeviceListener = ipConnectionClass.getDeclaredMethod("callDeviceListener", Device.class, byte.class, byte[].class);
        callDeviceListener.setAccessible(true);

        //start thread for each callback event
        for (int callbackIndex : callbackIndizes) {
            new Thread(() -> {

                try {

                    Random random = new Random();
                    while (true) {

                        //Generates values -1, 0 or 1
                        int randomDiff = random.nextInt(3) - 1;

                        //Invoke on device
                        callDeviceListener.invoke(ipcon, bricklet, (byte) callbackIndex, new byte[]{0, 0, 0, 0, 0, 0, 0, 0, (byte) (startValue + randomDiff), 0});


                        //wait 5s
                        Thread.sleep(THREAD_SLEEP_MILLIS);
                    }
                } catch (IllegalAccessException | InvocationTargetException | InterruptedException e) {
                    e.printStackTrace();
                }
            }).start();
        }
    }
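With those threads running, the mocked bricklet behaves like a real one as far as listeners are concerned. A minimal sketch of consuming the generated values (using the normal listener API of the Tinkerforge Java bindings):

//Register a regular temperature listener on the mocked bricklet -
//it gets called with a generated value every time the thread fires the callback
brickletTemperature.addTemperatureListener(new BrickletTemperature.TemperatureListener() {
    @Override
    public void temperature(short temperature) {
        System.out.println("Mocked temperature callback: " + temperature);
    }
});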
So have fun coding.