Upgrading a Spring Boot application from JUnit 4 to JUnit 5 (Jupiter)

I realised this is only doable with Spring 5.

To migrate from JUnit 4 to JUnit 5 you can replace @RunWith(SpringRunner.class) with @ExtendWith(SpringExtension.class).

Unfortunately, Spring Boot 1.5.9.RELEASE is based on Spring 4, and SpringExtension is only available since Spring 5.

Sources: https://stackoverflow.com/questions/48019430/junit5-with-spring-boot-1-5 and http://www.baeldung.com/junit-5-runwith

 

Exclude the transitive JUnit 4 dependency from the Spring Boot test starter.

Current:

<dependency>
 <groupId>org.springframework.boot</groupId>
 <artifactId>spring-boot-starter-test</artifactId>
 <scope>test</scope>
</dependency>

After:

<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-test</artifactId>
  <scope>test</scope>
  <exclusions>
    <exclusion>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
    </exclusion>
  </exclusions>
</dependency>

 

This will break all your JUnit imports in your test classes, and your vigilant IDE should already be complaining about them.

import org.junit.Test;
import org.junit.runner.RunWith;

Down also go your annotations:

@RunWith(SpringRunner.class)
@Test

You should now be able to use your IDE's assist features to add the JUnit 5 dependency to your pom, as shown in the following image for IntelliJ, or simply copy the dependency from the pom snippet below.

[Image: add_junit5_to_classpath – IntelliJ quick fix adding JUnit 5 to the classpath]

 

Dependency snippet:
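A minimal example using the junit-jupiter-api and junit-jupiter-engine artifacts (the version shown is illustrative; use the current JUnit Jupiter release):

<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-api</artifactId>
  <version>5.0.2</version> <!-- illustrative version -->
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-engine</artifactId>
  <version>5.0.2</version> <!-- illustrative version -->
  <scope>test</scope>
</dependency>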

 

Make things easy and do a global find/replace:

import org.junit.Test; -> import org.junit.jupiter.api.Test;

import org.junit.runner.RunWith; -> import org.junit.jupiter.api.extension.ExtendWith;

import org.springframework.test.context.junit4.SpringRunner; -> import org.springframework.test.context.junit.jupiter.SpringExtension;

@RunWith(SpringRunner.class) -> @ExtendWith(SpringExtension.class)
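After the find/replace, a migrated test class ends up looking something like this (a minimal sketch; MyService is a hypothetical bean used purely for illustration):

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit.jupiter.SpringExtension;

// JUnit 5 replaces the JUnit 4 runner with an extension
@ExtendWith(SpringExtension.class)
@SpringBootTest
class MyServiceTest {

    @Autowired
    private MyService myService; // hypothetical bean, for illustration only

    @Test
    void contextLoadsAndBeanIsWired() {
        Assertions.assertNotNull(myService);
    }
}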


GraphDB Connectors – Elasticsearch example

Semantic Search gets the power of Full Text Search

 

Pre-requisites:

  1. An installed instance of GraphDB (currently only the Ontotext Enterprise edition has connectors)
  2. An installed instance of Elasticsearch
    1. With port 9300 open and listening (this can be configured in */config/elasticsearch.yml or through your Puppet/Chef setup)
    2. If you are running this in Vagrant, ensure all relevant ports are forwarded to your host [9200, 9300, 12055 etc.]

 

Prepare GraphDB

  1. Set up the GraphDB location
  2. Set up a repository and switch it on as the default

[Image: GraphDB Locations and Repositories]

Create Elasticsearch Connector

  1. Go to the SPARQL tab
  2. Insert a query like the one below and hit Run

 


PREFIX : <http://www.ontotext.com/connectors/elasticsearch#>
PREFIX inst: <http://www.ontotext.com/connectors/elasticsearch/instance#>

INSERT DATA {inst:my_index :createConnector '''
{
  "elasticsearchCluster": "vagrant",
  "elasticsearchNode": "localhost:9300",
  "types": ["http://www.ontotext.com/example/wine#Wine"],
  "fields": [
    {"fieldName": "grape",
      "propertyChain": [
        "http://www.ontotext.com/example/wine#madeFromGrape",
        "http://www.w3.org/2000/01/rdf-schema#label"
      ]},
    {"fieldName": "sugar",
      "propertyChain": [
        "http://www.ontotext.com/example/wine#hasSugar"
      ],"orderBy": true},
    {"fieldName": "year",
      "propertyChain": [
        "http://www.ontotext.com/example/wine#hasYear"
      ]}]}
''' .
}

3.  Go over to Elasticsearch and confirm that you have a newly created index [my_index]; it will be empty for now (a quick curl check is shown after the status queries below)

4.  An example debugging step is to check the listed connectors and their status:


PREFIX : <http://www.ontotext.com/connectors/elasticsearch#>

SELECT ?cntUri ?cntStr {
  ?cntUri :listConnectors ?cntStr .
}

PREFIX : <http://www.ontotext.com/connectors/elasticsearch#>

SELECT ?cntUri ?cntStatus {
  ?cntUri :connectorStatus ?cntStatus .
}
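To confirm the new index from the Elasticsearch side (step 3 above), a quick check over HTTP, assuming Elasticsearch's REST port 9200 is reachable on localhost:

# list all indices and confirm my_index is present (0 docs for now)
curl 'http://localhost:9200/_cat/indices?v'

# or hit the index directly
curl 'http://localhost:9200/my_index/_search?pretty'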

 

Insert Data in GraphDB

 

  1. The connector listens for any data changes and inserts/updates/syncs the accompanying Elasticsearch copy, so committing the RDF below is all that is needed (a verification query follows the snippet).

 


@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix : <http://www.ontotext.com/example/wine#> .

:RedWine rdfs:subClassOf :Wine .
:WhiteWine rdfs:subClassOf :Wine .
:RoseWine rdfs:subClassOf :Wine .

:Merlo
    rdf:type :Grape ;
    rdfs:label "Merlo" .

:CabernetSauvignon
    rdf:type :Grape ;
    rdfs:label "Cabernet Sauvignon" .

:CabernetFranc
    rdf:type :Grape ;
    rdfs:label "Cabernet Franc" .

:PinotNoir
    rdf:type :Grape ;
    rdfs:label "Pinot Noir" .

:Chardonnay
    rdf:type :Grape ;
    rdfs:label "Chardonnay" .

:Yoyowine
    rdf:type :RedWine ;
    :madeFromGrape :CabernetSauvignon ;
    :hasSugar "dry" ;
    :hasYear "2013"^^xsd:integer .

:Franvino
    rdf:type :RedWine ;
    :madeFromGrape :Merlo ;
    :madeFromGrape :CabernetFranc ;
    :hasSugar "dry" ;
    :hasYear "2012"^^xsd:integer .

:Noirette
    rdf:type :RedWine ;
    :madeFromGrape :PinotNoir ;
    :hasSugar "medium" ;
    :hasYear "2012"^^xsd:integer .

:Blanquito
    rdf:type :WhiteWine ;
    :madeFromGrape :Chardonnay ;
    :hasSugar "dry" ;
    :hasYear "2012"^^xsd:integer .

:Rozova
    rdf:type :RoseWine ;
    :madeFromGrape :PinotNoir ;
    :hasSugar "medium" ;
    :hasYear "2013"^^xsd:integer .


TypeScript is great but I just want to write my Angular2 app in Java8

It is really hard to look at any other alternative to TypeScript for Angular2 development. Not only is it easy to learn, it is also far less error-prone than developing in plain JS, as you get static type checking for classes, interfaces and so on.

Sometimes you just want to write your Angular2 app in Java 8 (and hopefully Java 9 soon). This is particularly a no-brainer if it’s only a small application, where splitting the app into multiple components/tiers (web client, service and backend REST API) is not really worth the overhead.

[Image: Angular2Boot in IntelliJ IDEA]

 


Enter Angular2Boot

  • Write your Angular2 app in Java 8
  • A framework built on top of Angular 2, GWT and Spring Boot
    • GWT is used to compile the Java to JS
    • You are free to mix GWT and Angular, but there is little reason to unless you are dealing with legacy GWT code

Benefits

  1. TypeScript is good, but here you get an even more strongly typed OO language in Java
  2. Numerous tried and tested tools and IDEs are around for Java
  3. And when you need one single jar, nothing beats one uber Spring Boot jar! Not to mention the simplicity and ease of Spring Boot, especially when building POCs
  4. And let’s face it, Java is a language of choice for building robust applications

 

Give it a try (5 minutes)

 

Create Project

Generate the Angular and GWT app from the archetype template:

mvn archetype:generate \
  -DarchetypeGroupId=fr.lteconsulting \
  -DarchetypeArtifactId=angular2-gwt.archetype \
  -DarchetypeVersion=1.6
  • This will then scan for and download dependencies etc. You’ll then be prompted to feed in a few more details:
# Define value for property 'groupId': com.mosesmansaray.play
# Define value for property 'artifactId': angular-gwt-in-java8-example
# Define value for property 'version' 1.0-SNAPSHOT: :
# Define value for property 'package' com.mosesmansaray.play: :
  • Then confirm the properties configuration to complete.

Install

To install/produce an executable fat jar:

mvn clean install

  • The above will complete the download of further dependencies needed to compile the application.
  • The jar should then be ready in your application’s target folder, e.g.

/angular-gwt-in-java8-example/target/angular-gwt-in-java8-example-1.0-SNAPSHOT.jar

Run

To run the fat jar

java -jar target/angular-gwt-in-java8-example-1.0-SNAPSHOT.jar

Development

Developing/Live reload

  • Backend: mvn spring-boot:run
  • Frontend: mvn gwt:run-codeserver

Resources

  1. Documentation and more at lteconsulting.fr
  2. Library source code
  3. Or check out Arnaud Tournier’s talk at GWTcon 2016 below:
    1. YouTube quick run-through
    2. Speaker Deck slides
  4. Angular2Boot Tour of Heroes tutorial
  5. Demos on GitHub
  6. The angular2-gwt.archetype

Elasticsearch Ransomware

TLDR:

  1. Use X-Pack if you can,
  2. Do not expose your cluster to the internet,
  3. Do not use default configurations e.g. ports,
  4. Disable http if possible,
  5. If it must be internet facing: run it behind a firewall, a reverse proxy such as Nginx (see the example config sketch after this list), a VPN, etc.,
  6. Disable scripts,
  7. Take regular backups of your data with Curator, if you are not doing so already.
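For point 5, a minimal sketch of an Nginx reverse proxy in front of Elasticsearch, assuming Elasticsearch listens on localhost:9200 and an htpasswd file already exists (the server name and certificate paths are placeholders):

server {
    listen 443 ssl;
    server_name es.example.com;

    ssl_certificate     /etc/nginx/ssl/es.example.com.crt;
    ssl_certificate_key /etc/nginx/ssl/es.example.com.key;

    location / {
        # require basic auth before anything reaches the cluster
        auth_basic           "Restricted Elasticsearch";
        auth_basic_user_file /etc/nginx/.htpasswd;

        proxy_pass http://localhost:9200;
        proxy_set_header Host $host;
    }
}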

Well, we all saw that coming, didn’t we? Once MongoDB started being ransomed by criminals, other NoSQL technologies were surely next in the queue. Now it is Elasticsearch’s turn, and it is no surprise either that many Elasticsearch clusters are open to the internet. It goes without saying that even the “secured” ones are mostly behind weak/guessable passwords, on default ports, with unneeded HTTP enabled.

The attackers are currently emptying out clusters, leaving behind a note demanding payment:

 “Send 0.2 BTC (bitcoin)to this wallet xxxxxxxxxxxxxx234235xxxxxx343xxxx  if you want recover your database! Send to this email your service IP after sending the bitcoins xxxxxxx@xxxxxxx.org”

Rest assured, if you are using Elastic Cloud you will be protected by its default Shield/X-Pack protection. To protect your self-hosted cluster, the team at Elastic have posted a guide here. Such a guide really should not be news to any Elasticsearch admin! If it is, then action is nigh!

There is also a detailed step by step guide on all things securing your Elasticsearch cluster: “Don’t be ransacked: Securing your Elasticsearch cluster properly” by Itamar Syn-Hershko

So far it has been mostly Amazon-exposed services, but the same ransomware techniques against an insecure (wrongly configured) Elasticsearch instance can be applied to any other hosted or self-managed Elasticsearch service.

Cleaning Elasticsearch Data being indexed

Sometimes we just don’t have control over the source of the data coming into our Elasticsearch indices. In such cases you want to clean the data, removing unwanted content such as HTML tags, before it is put into your Elasticsearch index. This prevents unwanted and unpredictable behaviour.

For instance, given the text below:

<a href=\"http://somedomain.com>\">website</a>

 

If the above is indexed without cleaning the HTML, a search for “somedomain” will match documents containing the link. That might be what you want, but in most cases it is not. To prevent this you can use a custom analyser to clean your data.

Below is an example solution, along with some useful techniques to debug and analyse your analyser, such as querying the actual indexed data (as opposed to the Elasticsearch _source field, which always holds the raw, unmodified data that hit Elasticsearch).

Cleaning Elasticsearch Data

 

Create a new index

With the required html_strip character filter configured:

PUT /html_poc_v3
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_html_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "char_filter": [
            "html_strip"
          ]
        }
      }
    }
  },
  "mappings": {
    "html_poc_type": {
      "properties": {
        "body": {
          "type": "string",
          "analyzer": "my_html_analyzer"
        },
        "description": {
          "type": "string",
          "analyzer": "standard"
        },
        "title": {
          "type": "string",
          "index_analyzer": "my_html_analyzer"
        },
        "urlTitle": {
          "type": "string"
        }
      }
    }
  }
}
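Before indexing anything, the _analyze API is a handy way to sanity-check the new analyzer (a sketch; the exact _analyze request format varies between Elasticsearch versions). The markup should be stripped, leaving only the token “website”:

GET /html_poc_v3/_analyze
{
  "analyzer": "my_html_analyzer",
  "text": "<a href=\"http://somedomain.com>\">website</a>"
}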

 

 

Post Some Data

POST /html_poc_v3/html_poc_type/02
{
  "description": "Description &lt;p&gt;Some d&amp;eacute;j&amp;agrave; vu &lt;a href=\"http://somedomain.com&gt;\"&gt;website&lt;/a&gt;",
  "title": "Title &lt;p&gt;Some d&amp;eacute;j&amp;agrave; vu &lt;a href=\"http://somedomain.com&gt;\"&gt;website&lt;/a&gt;",
  "body": "Body &lt;p&gt;Some d&amp;eacute;j&amp;agrave; vu &lt;a href=\"http://somedomain.com&gt;\"&gt;website&lt;/a&gt;"
}

Now retrieve indexed data

This will bypass the _source field and fetch the actual indexed data/tokens:

GET /html_poc_v3/html_poc_type/_search?pretty=true
{
  "query": {
    "match_all": {}
  },
  "script_fields": {
    "title": {
      "script": "doc[field].values",
      "params": {
        "field": "title"
      }
    },
    "description": {
      "script": "doc[field].values",
      "params": {
        "field": "description"
      }
    },
    "body": {
      "script": "doc[field].values",
      "params": {
        "field": "body"
      }
    }
  }
}

Example Response

Note the difference between the title, description and body fields:

{
  "took": 2,
   "timed_out": false,
   "_shards": {
      "total": 5,
      "successful": 5,
      "failed": 0
   },
   "hits": {
      "total": 1,
      "max_score": 1,
      "hits": [
         {
            "_index": "html_poc_v3",
            "_type": "html_poc_type",
            "_id": "02",
            "_score": 1,
            "fields": {
               "title": [
                  [
                     "Some",
                     "Title",
                     "déjà",
                     "vu",
                     "website"
                  ]
               ],
               "body": [
                  [
                     "Body",
                     "Some",
                     "déjà",
                     "vu",
                     "website"
                  ]
               ],
               "description": [
                  [
                     "a",
                     "agrave",
                     "d",
                     "description",
                     "eacute",
                     "href",
                     "http",
                     "j",
                     "p",
                     "some",
                     "somedomain.com",
                     "vu",
                     "website"
                  ]
               ]
            }
         }
      ]
   }
}

Further Cleaning Elasticsearch Data References:

Use this tool to test your analyser: elasticsearch-inquisitor

 

Missing lines in Elasticsearch logs at midnight

The Case of the Missing Logs

I was debugging a curious case of my Elasticsearch instance on my Vagrant dev box going into RED state every night at 00:00:00, consistently, as far back as I could remember.

Right, the obvious thing to do is look at the logs, right? Except in this set of rotated logs there are no lines between 23:40 and 00:00:05; not in the current un-rotated log, nor in the previous set.

At First Pass:

  1. Elasticsearch rotates its own logs. Could this process be causing the missing Elasticsearch log lines?
  2. Marvel creates new daily indices at 00:00:00. Could this be causing the missing Elasticsearch log lines?

What was really causing the missing logs

Well, by default Elasticsearch uses log4j. However, instead of the standard log4j.properties file you get with log4j, Elasticsearch uses a configuration translated into YAML format, stripped of all the tell-tale log4j prefixes. A closer look at the configuration led to a curious investigation of the type of rolling appender in use, DailyRollingFile, which led to this revelation:

DailyRollingFileAppender extends FileAppender so that the underlying file is rolled over at a user chosen frequency. DailyRollingFileAppender has been observed to exhibit synchronization issues and data loss. The log4j extras companion includes alternatives which should be considered for new deployments and which are discussed in the documentation for org.apache.log4j.rolling.RollingFileAppender.

Source :  Apache’s DailyRollingFileAppender Documentation

Missing Elastic logs Root Cause:

The synchronisation issue with DailyRollingFileAppender must be the cause of the missing Elasticsearch log lines around midnight.

Missing Elastic logs fix:

Use one of the log4j alternatives to DailyRollingFileAppender; in this case RollingFileAppender, changing the rolling strategy to roll logs when they reach a certain file size. Replace DailyRollingFileAppender with RollingFileAppender and remove the datePattern setting, which only applies to DailyRollingFileAppender.

Example:

file:
    type: rollingFile
    file: ${path.logs}/${cluster.name}.log
    maxFileSize: 10000000
    maxBackupIndex: 10
    layout:
        type: pattern
        conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

Note: YAML is particular about tabs!

Happy Ending

Marvel turned out to be the cause of my Elasticsearch cluster going into RED state at midnight, on creation of the new daily .marvel* index. Which makes sense, as there is a brief window (milliseconds to seconds) when the new index has been created but its shards and replicas are not yet allocated.

Lean Maven Release

The Lean Maven Release (AKA Maven Release on Steroids)

Simply put: to get rid of the Maven Release Plugin’s repetitive, time-wasting, inefficient builds and multiple check-ins to SCM, script this process instead:

mvn clean
mvn versions:set
mvn deploy
mvn scm:tag

This can be set up in both Jenkins and TeamCity. I was able to configure this within a few minutes, replacing my team’s use of the Maven Release Plugin. It is a huge time saver, and more people really need to do this, especially within a true Continuous Delivery team or any setting with a frequent need for builds.
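A minimal sketch of what the scripted process can look like in a CI job (the version scheme and the BUILD_NUMBER variable are assumptions; adapt them to your own numbering and CI tool):

# derive the release version, e.g. from the CI build number (assumption)
VERSION=1.0.${BUILD_NUMBER}

mvn clean versions:set -DnewVersion=${VERSION} -DgenerateBackupPoms=false
mvn deploy scm:tag -Dtag=${VERSION}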

Benefits

 So how big exactly was the improvement of Releases On Steroids over the Release Plugin?

See for yourself

                            Releases on Steroids     Maven Release Plugin
Clean/Compile/Test cycles   1                        3
POM transformations         0                        2
Commits                     0                        2
SCM revisions               1 (binary source tag)    3

See more at Axel Fontaine’s blog, where I first came across this piece of treasure after my manager tipped me off about it.

 

A Typical Fix

Typically, the following is all I ever need to add to a project to get the Maven-on-steroids pattern working.

Add the Properties

...
<maven.compiler.plugin.version>3.1</maven.compiler.plugin.version>
<maven.release.plugin.version>2.5</maven.release.plugin.version>
<maven.source.plugin.version>2.2.1</maven.source.plugin.version>
<maven.javadoc.plugin.version>2.9.1</maven.javadoc.plugin.version>
<maven.gpg.plugin.version>1.5</maven.gpg.plugin.version>
...

Deployment path settings

... Local deployment
<distributionManagement>
    <repository>
        <id>internal.repo</id>
        <name>Internal repo</name>
        <url>file:///${user.home}/.m2/repository/internal.local</url>
    </repository>
</distributionManagement>
...
... or Remote deployment
<distributionManagement>
    <repository>
      <uniqueVersion>false</uniqueVersion>
      <id>corp1</id>
      <name>Corporate Repository</name>
      <url>scp://repo/maven2</url>
      <layout>default</layout>
    </repository>
</distributionManagement>

Apache maven plugins

<pluginManagement>
<plugins>
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>${maven.compiler.plugin.version}</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>
    <version>${maven.release.plugin.version}</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-source-plugin</artifactId>
    <version>${maven.source.plugin.version}</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-javadoc-plugin</artifactId>
    <version>${maven.javadoc.plugin.version}</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-gpg-plugin</artifactId>
    <version>${maven.gpg.plugin.version}</version>
</plugin>
...
</plugins>
</pluginManagement>
<plugins>
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-release-plugin</artifactId>

    <configuration>
        <useReleaseProfile>false</useReleaseProfile>
        <releaseProfiles>release</releaseProfiles>
        <goals>deploy</goals>
    </configuration>
</plugin>
<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <outputDirectory>${project.build.directory}/releases/</outputDirectory>
        <descriptors>
            <descriptor>${basedir}/src/main/assemblies/plugin.xml</descriptor>
        </descriptors>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.3</version>
    <configuration>
        <source>${java.version}</source>
        <target>${java.version}</target>
    </configuration>
</plugin>
<plugin>
    <artifactId>maven-clean-plugin</artifactId>
    <version>2.6.1</version>
    <configuration>
        <filesets>
            <fileset>
                <directory>overlays</directory>
                <includes>
                    <include>**/*</include>
                </includes>
                <followSymlinks>false</followSymlinks>
            </fileset>
        </filesets>
    </configuration>

</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-deploy-plugin</artifactId>
    <version>2.8.2</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-source-plugin</artifactId>
    <version>2.4</version>
</plugin>
...

The Release Profile

The release profile referenced above is activated during deployment (e.g. mvn deploy -Prelease):

<profiles>
...
    <profile>
        <id>release</id>
        <properties>
            <activatedProperties>release</activatedProperties>
        </properties>
        <build>
            <pluginManagement>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-compiler-plugin</artifactId>
                        <version>${maven.compiler.plugin.version}</version>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-release-plugin</artifactId>
                        <version>${maven.release.plugin.version}</version>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-source-plugin</artifactId>
                        <version>${maven.source.plugin.version}</version>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-javadoc-plugin</artifactId>
                        <version>${maven.javadoc.plugin.version}</version>
                    </plugin>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-gpg-plugin</artifactId>
                        <version>${maven.gpg.plugin.version}</version>
                    </plugin>
                </plugins>
            </pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-source-plugin</artifactId>
                    <executions>
                        <execution>
                            <id>attach-sources</id>
                            <goals>
                                <goal>jar</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-javadoc-plugin</artifactId>
                    <executions>
                        <execution>
                            <id>attach-javadocs</id>
                            <goals>
                                <goal>jar</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-gpg-plugin</artifactId>
                    <executions>
                        <execution>
                            <id>sign-artifacts</id>
                            <phase>verify</phase>
                            <goals>
                                <goal>sign</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>
...


Assembly description

And the plugin assembly descriptor XML. Location: src/main/assemblies/plugin.xml

<?xml version="1.0" encoding="UTF-8"?>
<assembly>
    <id>plugin</id>
    <formats>
        <format>zip</format>
    </formats>
    <includeBaseDirectory>false</includeBaseDirectory>
    <dependencySets>
        <dependencySet>
            <outputDirectory>/</outputDirectory>
            <useProjectArtifact>true</useProjectArtifact>
            <useTransitiveFiltering>true</useTransitiveFiltering>
            <excludes>
            </excludes>
        </dependencySet>
    </dependencySets>
</assembly>

Note: if you intend to sign the package with the GPG plugin, you’ll need to configure it further for your environment. I will write a separate blog post on this later. You may skip signing with the command below:

mvn deploy -Prelease -Dgpg.skip=true

That’s it!

IntelliJ Tweaks

Hot deploy/swap to a servlet server

  1. File –> Settings –> Debugger –> HotSwap
    1. Enable “Reload classes in background”: true
    2. Enable “Reload classes after compilation”: Always
  2. Run/Debug Configurations
    1. Select “Update resources” in the “On frame deactivation” drop-down.

More here

 

Git Alias Configuration Example

Edit the .gitconfig file in your $HOME directory for some serious time-saving Git shortcuts:

[core]
 excludesfile = /Users/moses.mansaray/.gitignore_global
 autocrlf = input

[user]
 name = moses.mansaray
 email = moses.mansaray@domain.com

[push]
 default = simple

[alias]
 co = checkout
 cob = checkout -b
 cod = checkout develop
 ci = commit
 st = status

 save = !git add -A && git commit -m
 br = branch

 rhhard-1 = reset --hard HEAD~1
 rhhard-o = reset head --hard

 hist = log --pretty=format:\"%h %ad | %s%d [%an]\" --graph --date=short

 type = cat-file -t
 dump = cat-file -p

 llf = log --pretty=format:"%C(yellow)%h%Cred%d\\ %Creset%s%Cblue\\ [%cn]" --decorate --numstat
 lld = log --pretty=format:"%C(yellow)%h\\ %ad%Cred%d\\ %Creset%s%Cblue\\ [%cn]" --decorate --date=short

 amend = commit -a --amend
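
Example usage of a few of the aliases above:

git cob feature/my-branch        # checkout -b feature/my-branch
git save "wip: quick snapshot"   # add -A && commit -m "wip: quick snapshot"
git hist                         # compact decorated history graph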

ElasticSearch Curator Short Guide

Elasticsearch Curator Features

Curate, or manage, your Elasticsearch indices:
  • Alias management – add, remove
  • Shard routing allocation
  • Indices management – close indices, delete indices, open closed indices, optimize indices and modify the number of replicas
  • Snapshot (backup) management – show, back up, restore
  • Change the number of replicas per shard for indices
  • Pattern matching for statements (e.g. delete all indices matching .marvel*)

 

Example Elasticsearch Curator Use Cases

Automatically/manually:

  • Snapshot index creation
    • All indices except Marvel’s (regex support)
  • Restore from snapshots (using the Elasticsearch endpoint via curl… not with Curator currently)
    • All indices in the backup
    • A specific index from the backup
    • Keep the cluster state as is
  • Delete indices
    • Older than a date range
    • By a regex matching pattern

Background

  • Started off as clearESindices.py, with the simple goal of deleting indices.
  • It then became logstash_index_cleaner.py.
  • Then it moved under the Logstash repository as “expire_logs”.
  • After Jordan Sissel was hired by Elastic it became Elasticsearch Curator, now hosted at https://github.com/elastic/curator
  • Today: Curator performs many operations on your Elasticsearch indices, from delete to snapshot to shard allocation routing.

 

Curator all-in Command Line

  • Curator 2.0’s underlying change was the first attempt to detach the Elasticsearch API layer from the CLI.
  • This allows scripting to carry out any of the tasks featured above, meaning you can, for instance:
    • Automate your backup and restore procedures.
  • Also, since version 2.0, Curator ships an API alongside the wrapper scripts/entry points. This API allows you to roll out your own scripts to perform similar or totally different tasks using the same underlying code that Curator uses…
  • Documentation can be found here

 

Curator installation for Mac


# you may ignore the python/pip installation if you already have it
wget https://bootstrap.pypa.io/get-pip.py
sudo python get-pip.py

wget https://pypi.python.org/packages/source/u/urllib3/urllib3-1.8.3.tar.gz
pip install urllib3-1.8.3.tar.gz

wget https://pypi.python.org/packages/source/c/click/click-3.3.tar.gz -O click-3.3.tar.gz
sudo pip install click-3.3.tar.gz

# install elasticsearch-py
wget https://github.com/elastic/elasticsearch-py/archive/1.6.0.tar.gz -O elasticsearch-py.tar.gz
sudo pip install elasticsearch-py.tar.gz

# install elasticsearch-curator
wget https://github.com/elastic/curator/archive/v3.3.0.tar.gz -O elasticsearch-curator.tar.gz
sudo pip install elasticsearch-curator.tar.gz

# To test : Verify version
curator --version
# should echo: "curator, version 3.3.0"

# To example Commands try the help menu
curator --help
# should echo : help menu options

For more installation guides please see the official Elasticsearch Guides

 

Curator Command Line Flags

 

Index and Snapshot Selection

--newer-than
--older-than
--prefix
--suffix
--time-unit
--timestring
--regex
--exclude

Index selection only

--index
--all-indices

Snapshot selection only

--snapshot
--all-snapshots
--repository
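
A couple of example invocations tying the flags above together (sketches based on the Curator 3.x CLI installed earlier; check curator --help on your version for the exact syntax):

# delete time-based indices older than 30 days, matching a prefix
curator --host localhost delete indices --older-than 30 --time-unit days --timestring '%Y.%m.%d' --prefix logstash-

# snapshot all indices into an existing repository
curator --host localhost snapshot --repository my_backups indices --all-indices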