Coursera Full Stack Web Development Capstone Project

A couple of days ago I finished my capstone project for the Full Stack Web Development specialization on Coursera. It marks the end of the 6-course specialization covering Bootstrap CSS, AngularJS and NodeJS.

The Assignment

The assignment itself is simple:

Build a web application with the tools taught in the course (Bootstrap CSS, AngularJS, NodeJS, MongoDB).

Everybody is free to choose what to implement; however, it should be a small app, as it must be built in a short period of time. I decided to implement a message board where users can create boards and post messages (more below).

The Schedule

The capstone project takes 8 weeks in total, of which only 2 or 3 weeks are dedicated to actual coding. Each week has about 3 hours of workload, so you can spend approximately 9 hours on coding – but you will definitely spend more time!

  • Week 1: Ideation
  • Week 2: Ideation Report (assignment: 2-page PDF report about your idea)
  • Week 3: UI Design and Prototyping
  • Week 4: UI Design and Prototyping Report (assignment: 2-page PDF report with UI mockups)
  • Week 5: Architecture Design and Software Structure
  • Week 6: Architecture Design and Software Structure Report (assignment: 3-page PDF report with architecture, structure, REST URLs and data model)
  • Week 7: Project Implementation and Final Report
  • Week 8: Final Submission and Report (assignment: 2-page PDF report, code on GitHub, running app on IBM Bluemix)

The Workload

You will spend most of your time coding, but you will also spend a huge amount of time on organisational tasks. These tasks can take a lot of time:

  • Creating a GitHub project, including a .gitignore and so on
  • Setting up a Travis CI build
  • Deploying to IBM Bluemix (reading tutorials, writing a .cfignore and a manifest.yml)
  • Doing all the dependency management
  • Installing and configuring MongoDB
  • Managing different configs for your local development environment and IBM Bluemix

While all of those tasks are absolutely necessary for every project setup, they can nevertheless easily take up several hours. This eats into your coding time.
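For illustration, a minimal Cloud Foundry manifest.yml for a Node app on Bluemix might look like this (app name and start command are made up):

```yaml
applications:
- name: mebo
  memory: 256M
  instances: 1
  command: node server.js
```

Together with a .cfignore (which works like a .gitignore for the upload), this is all the deployment metadata Bluemix needs for a simple app.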

My Project

I called my project MEBO, which simply stands for Message Board. It's a small application where users can create boards and post messages on them. Everyone who knows the link to a board can access it and edit all messages; there is no login or anything like that. You can find the project on GitHub:

Lessons Learned

  • Make it small! Of course, everybody wants to build a super fancy application with its own user management, administration view and all kinds of features. But make your life easy and do just what's needed to demonstrate your skills.
  • Start early with the deployment and integration to IBM Bluemix. No matter how well your project is implemented, if it doesn’t run on IBM Bluemix (or anywhere else) where I can see it, you will not pass the assignment. Get things up and running from the beginning.
  • Make it look nice. The first impression is very important. If your project looks nice and your GitHub repository makes a good impression, you're on the right path.

Best regards,

Coursera Full Stack Web Development Course Review

During the last 6 months I did the Full Stack Web Development course on Coursera. Since I'm currently about to finish the course by implementing my final capstone project, I wanted to share my thoughts about the course and its pros and cons.

About the course

The Full Stack Web Development course consists of 6 single courses. Altogether, they make up the complete course which Coursera calls a specialization.

  1. HTML, CSS and JavaScript (3 weeks)
  2. Front-End Web UI Frameworks and Tools (4 weeks)
  3. Front-End JavaScript Frameworks: AngularJS (4 weeks)
  4. Multiplatform Mobile App Development with Web Technologies (4 weeks)
  5. Server-side Development with NodeJS (4 weeks)
  6. Full Stack Web Development Specialization Capstone Project (8 weeks)

It’s possible to take single courses, but of course it’s recommended to take all of them and do them one after another in the given order.

The complete specialization takes 27 weeks and costs 70 € per course, so 420 € in total.

HTML, CSS and JavaScript

This course teaches the basics of HTML, CSS and JavaScript. It was made for beginners, so if you have a little knowledge of those topics you will probably be bored. On the other hand, if you have absolutely no knowledge about this HMLT, SSC and LavaScipting stuff, you will have a hard time learning everything in just 3 weeks – because that's how long the course takes.

IMHO: I don’t recommend this one. If you know HTML, CSS and JavaScript you won’t learn anything from the course. And if you are absolutely new to those technologies, the course is far too short.

Front-End Web UI Frameworks and Tools

This course focuses on Bootstrap CSS. Although Bootstrap CSS isn’t too complicated, a lot of people don’t understand the principles behind it (e.g. the grid system with its rows and columns). So even if you used Bootstrap CSS before, this course will help you to understand things better.

IMHO: I recommend this one as it really helps to better understand one of the most popular UI frameworks right now.

Front-End JavaScript Frameworks: AngularJS

What the second course was for Bootstrap CSS, this one is for AngularJS. And again, if you know AngularJS there will be nothing new for you. But if you are new to AngularJS, this course will find the right tempo to give you a good first overview. However, the course is not really up to date with AngularJS’ latest version.

IMHO: I recommend it, as this course gives you a good introduction to AngularJS.

Multiplatform Mobile App Development with Web Technologies

This course is built around the Ionic Framework for mobile apps. If you are interested in building mobile apps, this course will teach you one out of a thousand ways to do it. I think the Ionic Framework is not the worst way to make mobile apps, but the course is still very opinionated and has a very narrow focus.

IMHO: Out of all 6 courses, this is the one I can recommend least. Ionic might be good for some use cases, but the example app built in the course doesn’t benefit from it in any way. It’s just the very same app as made before. Making everything responsive would have been a much better approach. This course also relies heavily on locally installed software like Ionic itself and the Android or iOS simulators. If one of those pieces of software doesn’t run on your device, you are screwed. It took me hours to get the Android simulator running, before I switched to the iOS simulator, which also took me hours.

Server-side Development with NodeJS

This course is about NodeJS and MongoDB. Although the architecture of the example app of the course is terrible (they make database queries in the REST controllers!), the course gives a nice introduction to server-side JavaScript. Both technologies – NodeJS and MongoDB – are state of the art.

IMHO: I can recommend this one to get started with NodeJS and MongoDB, but also to see some drawbacks of these much-hyped technologies.

Full Stack Web Development Specialization Capstone Project

At the end of the course, everybody must implement a final project. The project should show off the learned skills and should – of course – use technologies taught in the course. So you are forced to get your hands dirty and write some code of your own.

IMHO: This part of the specialization is very interesting, but it has some pitfalls, too. It’s important to choose the project idea wisely. The course only takes 8 weeks, of which only 2 are dedicated to actual programming. So whatever you implement, it must be something small.

Special note: To complete the course, you must deploy your project to IBM’s PaaS Bluemix, for which you will get a test account. As I had worked with AWS and some other DevOps technologies before, this wasn’t too hard for me. However, Bluemix is a terrible platform. If you are not familiar with PaaS, plan some extra time to get things working. The course will not prepare you for that in any way.

How Coursera works

Before you take a course at Coursera, you should first understand how the platform works: courses on Coursera are made up of online videos, text and PDFs, exercises and assignments. It’s up to the teacher how the course is laid out.

Every course starts at a specific date and ends at a specific date. For example, a course might start every 2 months on the first of the month and end 4 weeks later. You must (!) take the course during this period. It’s just like a physical class you would take at school or university.

Most courses require assignments to complete them. This means there will be an exercise at the end which you must complete and upload your solution for. Most assignments are peer-graded, which means that you must review your classmates’ work and they will review yours.

At the end, you will get a certificate with a lot of buzzwords for this specific course.

IMHO: Coursera is nice, but it’s not the same as a real physical class at university. Especially the peer-graded assignments are a problem. Some people tend to criticize the oddest things, while others just give you the points without even looking at your work. It’s completely up to you how seriously you take it.

What I learned

Things I liked to learn

Things I didn’t like after learning them

What I missed

What I missed completely during all 6 courses was unit testing. None of the courses teach anything about testing, neither in the front end (Jasmine, Protractor) nor in the backend (Mocha, Sinon).


Best regards,

AngularJS Intensive Workshop by Robin Böhm/Symetics GmbH

Last week I attended an AngularJS Intensive Workshop by Robin Böhm/Symetics GmbH. Symetics is the company behind and provides consulting and training all around JavaScript, AngularJS (1/2) and TypeScript.


The location

The 3-day workshop took place in the UNPERFEKT Haus in Essen. The UNPERFEKT Haus is an open space for artists, students and musicians and provides rooms, WLAN and a very creative environment. It was a really great location for the seminar and we all felt comfortable.

The trainer

Our trainer was Robin Böhm, the CEO/founder of Symetics GmbH. He works as a freelancer and does consulting, in-house training and workshops all around web technologies like AngularJS. During the 3 days he provided a lot of information and ran a really intense workshop. However, the atmosphere was always relaxed and Robin was open to discussions and questions – even during the lunch break.

The workshop

The workshop was divided into two parts: (1) a round trip through JavaScript and (2) a deep dive into AngularJS. It was a good mix of presentations and hands-on sessions on a prepared sample project. Robin managed to cover a huge amount of topics:

  • components vs. directives
  • routing (with the new router from AngularJS 1.5)
  • testing (with Jasmine)
  • the promise API
  • build tools
  • books to continue

Robin also tried to give us an outlook on AngularJS 2 and how we can prepare our current projects for it.

My opinion

Actually, there wasn’t much I didn’t like. The workshop covered all relevant topics and gave us a lot of useful insights into AngularJS and related stuff.

I can really recommend it!

It was fast, well prepared and very practical (we wrote code for about 1/3 of the time). The group was medium-sized, maybe 14 people (mostly Java developers). The only downside was that Robin sometimes provided too much detail on a particular topic. I’m more comfortable sticking to the underlying concepts of a technology and diving into the nitty-gritty details when I’m actually working with it.

I’m looking forward to the next workshop! Thanks!



Anti Patterns

A couple of days ago I went to a tech talk by eBay in Berlin. The talk was held by a developer of the eBay Kleinanzeigen team (a subsection of eBay in Germany) who talked about development patterns his team has learned over the years. He mentioned behavioral patterns, design patterns and coding patterns he and his colleagues discovered as good practices for their projects. Since nearly everything can be seen as a pattern, the talk was a mix of lessons learned and actual advice on how to structure code. Cool! It is always good to talk and hear about coding experiences, and it also reminded me of something which is often more important than patterns: anti patterns!


The idea of anti patterns always comes to my mind when I’m reading old code. And it comes up more often when the code is super sophisticated and fancy. The more people try to over-engineer their code, the more they build things which look like some complicated pattern but are actually anti patterns.

Would you like an example? Here’s a very trivial anti pattern. It’s not a big thing, just a bad habit. It looks like this:
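Something along these lines (a hypothetical sketch; class and method names are mine):

```java
// A class with only static methods (a typical factory) that also got a
// private constructor "for the safety of everybody in the room".
public final class MessageFactory {

    // Prevents instantiation -- although instantiating it would do no harm.
    private MessageFactory() {
    }

    // The only thing the class is actually used for.
    public static String createMessage(String board, String text) {
        return "[" + board + "] " + text;
    }
}
```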

Anything wrong here? No, just a factory. Anything useless? Yes, a little bit. It’s very common to see a class with only static methods (like a factory) have a private constructor. Why? Because the developer thought it’s evil to instantiate it and he must prevent that for the safety of everybody in the room! So he creates something which looks like a singleton (but isn’t). He does this in every suitable class, because it’s his pattern. But it achieves nothing. What would happen if you instantiated such an object? Nothing. And if you called static methods on that instance? Nothing either. It wouldn’t look good, but it would work, and your IDE would warn you. So what problem does it solve? There is none. It just looks like some useful pattern, but actually does nothing but clutter your code. An anti pattern.

There are more of those small bad habits (like using final everywhere) but also bigger ones (like an interface for everything). I’m going to think about this a little more and come up with another example!

Best regards,

Hello Berlin!


The last months have been an interesting and busy time for me (that’s why I didn’t write as much as I had planned, but I already have some new drafts in my WordPress!). After I finished my master’s thesis at Informatica in Stuttgart (greetings to the whole team!), I moved to Berlin for my first “real” job. I started at a company called Westernacher Products and Services, which builds a lot of web applications and does consulting for legal institutions in Germany. Based on an up-to-date technology stack with Spring, Gradle, ExtJS and AngularJS, Westernacher implements document management and all kinds of business applications. Being part of the development team, I am looking forward to a bunch of interesting projects and a lot of new stuff to learn. Nice to be here!

Best regards,

DeployMan (command line tool to deploy Docker images to AWS)


2014-07-29 11_34_11-Java EE - Eclipse

Yesterday, I published a tool called DeployMan on GitHub. DeployMan is a command line tool to deploy Docker images to AWS and was the software prototype for my master thesis. I wrote my thesis at Informatica in Stuttgart-Weilimdorf, so first of all, I want to say thank you to Thomas Kasemir for the opportunity to put this online!


At the time I am writing this post, DeployMan is a pure prototype. It was created for academic research and as a demo for my thesis. It is not ready for production. If you need a solid tool to deploy Docker images (to AWS), have a look at Puppet, CloudFormation (for AWS), Terraform, Vagrant, fig (for Docker) or any other orchestration tool that has come up in the last couple of years.

What DeployMan does

DeployMan can create new AWS EC2 instances and deploy a predefined stack of Docker images on them. To do so, DeployMan takes a configuration file called a formation. A formation specifies what the EC2 machine should look like and which Docker images (and which configurations) should be deployed. Docker images can be deployed either from a Docker registry (the public one or a private one) or as tarballs from S3 storage. Together with each image, a configuration folder is pulled from S3 storage and mounted into the running container.

Here is an example of a formation which deploys a Nginx server with a static HTML page:


DeployMan provides a command line interface to start instances and do some basic monitoring of the deployment process. Here is a screenshot which shows some formations (which can be started) and the output of a started Logstash server:


To keep track of the deployment process in a more pleasant way, DeployMan has a web interface. The web interface shows details about the machines, such as the deployed images and which containers are running. Here is what a Logstash server would look like:


The project


You can find the project on GitHub. I wrote a detailed README which explains how to build and use DeployMan. To test DeployMan, you need an AWS account (there are also free accounts).

The project is made with Java 8, Maven, the AWS Java API, the Docker Java API and a lot of small stuff like Apache Commons. The web interface is based on Spark (for the server), Google’s AngularJS and Twitter’s Bootstrap CSS.

Best regards,

Presentation of my master thesis

Over the last six months, I wrote my master thesis about porting an enterprise OSGi application to a PaaS. Last Monday, the 21st of July 2014, I presented the main results of my thesis to my professor (best greetings to you, Mr. Goik!) and to my colleagues (thanks to all of you!) at Informatica in Stuttgart-Weilimdorf, Germany (where I had written my thesis based on one of their product information management applications, called Informatica PIM).

Here are the slides of my presentation.

While my master thesis also covers topics like OSGi, VMs and JEE application servers, the presentation focuses on my final solution: a deployment process for the cloud. Based on Docker, the complete software stack used for the Informatica PIM server was packaged into separate, self-contained images. Those images were stored in a repository and used to automatically set up cloud instances on Amazon Web Services (AWS).

The presentation gives answers to the following questions:

  • What is cloud computing and what is AWS?
  • What are containers and what is Docker?
  • How can we deploy containers?

To automate the deployment process of Docker images, I implemented my own little tool called DeployMan. It will show up at the end of my slides and I will write about it in a couple of days here. Although there are a lot of tools out there to automate Docker deployments (e.g. fig or Maestro), I wanted to do my own experiments and to create a prototype for my thesis.


Best regards,

The thesis writing toolbox

I spent the last weeks (or months?) at my desk writing my Master thesis about OSGi, PaaS and Docker. After my Bachelor thesis two years ago, this was my second big “literary work”. And like any computer scientist, I love tools! So here is the toolbox I used to write my thesis.


2014-06-20 14_07_02-Program Manager

I did most of my work with LyX, a WYSIWYM (what you see is what you mean) editor for LaTeX. LaTeX is a document markup language written by Leslie Lamport in 1984 for the typesetting system TeX (by Donald E. Knuth). So what does this mean? It means that TeX is the foundation for LaTeX (and LyX). It gives you the ability to set text on paper, print words in italics or leave space between two lines. But it is pretty raw and hard to handle. LaTeX is a collection of useful functions (called macros) to make working with TeX more pleasant. Instead of marking a certain word as bold and underlining it, you could just say it is a headline and LaTeX will do the rest for you. A simple LaTeX document would look like this:
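For instance, a minimal article document (my own sketch) is just:

```latex
\documentclass{article}
\begin{document}
\section{Introduction}
This is a \textbf{bold} word and this is an \emph{italic} one.
\end{document}
```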

By choosing the document class article, LaTeX will automatically render your text to A4 portrait paper with 10 pt font size and so on. You will not have to worry about the layout, just the content. This is what the above code looks like as a PDF:

2014-06-20 14_54_57-Namenlos-1.pdf - TeXworks

The main difference between writing your document in LaTeX instead of (e.g.) Microsoft Word is that you do not mix content and styling. If you write with Microsoft Word, you will always see your document as the final result (e.g. a PDF) would look. The Microsoft Word document will look the same as the PDF. This principle is called WYSIWYG (what you see is what you get). In LaTeX, however, you will only see your raw content until you press the compile button to generate a PDF. The styling is separated from the content and only applied in the last step, when generating the PDF. This is useful, because you do not have to fight with your formatting all the time – but you have to write some pretty ugly markup.

This is where LyX comes into play. LyX is an editor to work with LaTeX without writing a single line of code. It follows the WYSIWYM (what you see is what you mean) principle. This means you will see your document not in its final form, but as you mean it. Headlines will be big, bold words will be bold and italic words will be italic. Just as you mean it. However, the final styling will come in the end when generating the PDF. The LyX screenshot from above will look like this as a PDF:

2014-06-20 15_16_12-thesis_de.pdf - Adobe Reader


An important part of every thesis is literature. To organize my literature I use a tool called JabRef. Its website looks very ugly, but the tool itself is really cool. It lets you generate a library with all your books and articles you want to use as references. This is my personal library for my Master thesis (with 53 entries in total):

2014-06-20 15_25_13-Program Manager

JabRef generates and organizes a so-called BibTeX file. This is a simple text file, where every book has its own entry:
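For example, an entry might look like this (an illustrative entry, not one from my actual library):

```bibtex
@book{knuth1984,
  author    = {Donald E. Knuth},
  title     = {The {TeX}book},
  publisher = {Addison-Wesley},
  year      = {1984}
}
```

The key (here knuth1984) is what you later use to cite the work in your document.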

Every book or article I read gets its own entry in this file. I can make new entries with JabRef itself or with a generator. I can link my library to my LyX document, so every time I want to make a new reference or citation, I just open a dialog and pick the according reference:

2014-06-20 15_31_51-LyX_ ~_Dropbox_MA - SS14 - Master Arbeit_thesis_de.lyx

LyX will automatically create the reference/citation and the according entry in the bibliography:

2014-06-20 15_34_29-thesis_de.pdf - Adobe Reader


When you write a large document such as a thesis, you will probably make mistakes. Your university wants some changes and you will improve your document until its final version. However, a lot of people will not read it twice. Instead, they read it once, give you some advice and want to see the changes again. With latexdiff you can compare two LaTeX documents and create a document visualizing the changes. Here is an example from the PDF page shown above:

2014-06-20 15_41_35-diff.pdf - Adobe Reader

As you can see, I changed a word, inserted a comma and corrected a typo.

A great tutorial about latexdiff can be found at

Google Drive

To make graphics I use Google Drive. It is free, it is online and it is very easy. The feature I like most on Google Drive is the grid. You can align objects to each other so your drawing will look straight.

2014-06-20 15_54_26-Untitled drawing - Google Drawings


If you lose your thesis you are screwed! Since LyX/LaTeX documents are pure text, you could easily put them into a version control system such as Git or SVN. However, you will probably also use some graphics, maybe some other PDFs and stuff like that. To organize this, I simply use Dropbox. Not only will Dropbox save your files for you, it also has a history. So you can easily restore a previous version of your document:

2014-06-20 15_49_38-Revisionen - Dropbox

eLyXer – HTML export


eLyXer is a tool to export LyX documents as HTML. Although LyX is meant to create PDF documents in the first place, it is nice to have an HTML version too. eLyXer is already built into LyX, so you can export your document with a few clicks:

2014-07-02 11_19_37-LyX_ ~_Dropbox_MA - SS14 - Master Arbeit_thesis_de.lyx

Here is the CSS I used to style the HTML output:

The result looks like this:

2014-07-02 11_24_13-Portierung einer Enterprise OSGi Anwendung auf eine PaaS

The thesis writing toolbox

  • LyX: a WYSIWYM editor for LaTeX to write documents without a single line of code.
  • JabRef: an organizer for literature.
  • latexdiff: compares two LaTeX documents and generates a visualization of their differences.
  • eLyXer: exports LyX documents to HTML.

Best regards,

Who is using OSGi?

Who is using OSGi? If you take a look you will see more than a hundred members of the OSGi Alliance and a lot of big players such as IBM or Oracle. But let’s do a little investigation with Google Trends on our own.

Google Trends

Google Trends is a service where you can search for a particular keyword and get a timeline. The timeline shows you when the keyword was searched for and how many requests have been made. That is a great way to estimate how popular a certain technology is at a given time. Google will also show you where the search requests have been made – and that is where we start.

OSGi in Google Trends

If we search for “OSGi” on Google Trends we will see a chart just like the one shown below. As we can see, OSGi is somewhat past its peak and interest in the technology has been decreasing for a couple of years.

But even more interesting is the map which shows where the search requests have been made. As we can see, most requests came from China. In fourth place is Germany.

If we take a closer look at Germany, we see that most requests come from the south of the country.

But we can even get more specific and view the exact cities. It is a little bit hard to see in the chart, but you can click on the link at the bottom to see the full report. You will see Walldorf, Karlsruhe and Stuttgart on top. So what? Well, in Walldorf, there is one big player who is not on the list of the OSGi Alliance: SAP.

We can do the very same with the USA and we will end up in California and Redwood City, where companies like Oracle and Informatica are located.

Best regards,

Development speed, the Docker remote API and a pattern of frustration

One of the challenges Docker is facing right now is its own development speed. Since its initial release in January 2013, there have been over 7,000 commits (in one year!) by more than 400 contributors. There are more than 1,800 forks on GitHub and Docker brings out approximately one new release per month. Docker is developing at a rapid pace right now and this is really great to see!

However, this very high development speed leaves a lot of third-party tools behind. If you develop a tool for Docker, you have to keep a very high pace. If not, your tool is outdated within a month.

Docker remote API client libraries

A good example of how this development speed affects projects are the remote API client libraries for Docker. Docker offers a JSON API to access Docker in a programmatic way. It enables you, for example, to list all running containers and stop a specific one. All via JSON and HTTP requests.
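For instance, those two operations boil down to plain HTTP requests against the Docker daemon (host and port depend on your setup):

```
GET  /containers/json          lists the running containers as JSON
POST /containers/(id)/stop     stops the container with the given id
```

Prefixing the path with a version, e.g. /v1.11/containers/json, pins the request to a specific API version.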

To use this JSON API in a convenient way, people created bindings for their favorite programming language. As you can see below, there exist bindings for JavaScript, Ruby, Java and many more. I used some of them on my own and I am really thankful for the great work their developers have done!

But many of those libraries are outdated at the time I am writing this. To be exact: all of them are outdated! The current remote API version of Docker is v1.11 (see here for more) which none of the remote API libraries supports right now. Many of them don’t even support v1.10 or v1.9.

Here is the list of remote API tools as you find it at

  • Python: docker-py (v1.9)
  • Ruby: docker-api (v1.10)
  • JavaScript (NodeJS): dockerode (v1.10)
  • JavaScript (NodeJS): v1.7
  • JavaScript (Angular), WebUI: dockerui (v1.8)
  • Java: docker-java (v1.8)
  • Erlang: erldocker (v1.4)
  • Go: dockerclient (v1.10)
  • PHP: Docker-PHP (v1.9)
  • Scala: reactive-docker (v1.10)

How to deal with rapidly evolving APIs

How to deal with rapidly evolving APIs is a difficult question, and IMHO Docker made the right decision. By solely providing a JSON API, Docker chose a modern and universal technique. A JSON API can be used from any language or even from a web browser. JSON (together with a RESTful API) is the state-of-the-art technique to interact with services. Docker even leaves you the possibility to fall back to an old API version by adding a version identifier to the request. Well done.

But the decision to stay “universal” (by solely providing a JSON API) also means not getting specific. Getting specific (which means using Docker in a certain programming language) is left to the developers of third party tools. These tools are also evolving rapidly right now, no matter whether they are remote API bindings, deployment tools or hosting solutions (like CoreOS). This enriches the Docker ecosystem and makes the project even more interesting.

Bad third party tools will reflect badly on you

The problem is, even if Docker did a good job (which they did!), outdated or poorly implemented third party tools will reflect badly on Docker, too. If you use a third party library (which you maybe found via the official website) and it works fine, you will be happy with Docker and the third party library. But if the library stops working next month because you updated Docker and the library doesn’t take care of the API version, you will be frustrated with the tool and with Docker.

Pattern of frustration

This pattern of frustration occurs a lot in software development. Bad libraries cause frustration about the tool itself. Let’s take Java as an example. A lot of people complain that Java is verbose, uses class explosions as a pattern and makes things much more complicated than they should be. The famous AbstractSingletonProxyFactoryBean class of the Spring framework is just such an example (see +Paul Lewis). Another example is reading a file in Java, which was an awful pain:
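A sketch of the old way (class and file names are mine):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class OldSchoolFileReader {

    // Wrap the path in a FileReader, wrap that in a BufferedReader,
    // then collect the lines in a StringBuilder. Quite a ceremony.
    public static String read(String path) throws IOException {
        StringBuilder content = new StringBuilder();
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                content.append(line).append('\n');
            }
        } finally {
            reader.close();
        }
        return content.toString();
    }
}
```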

And even the new NIO API which came with Java 7 is not as easy as it could be:
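A sketch of the NIO way (again, names are mine):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class NioFileReader {

    // The String path must become a Path, the static method returns a
    // List<String>, which must be joined back into a single String.
    public static String read(String path) throws IOException {
        List<String> lines = Files.readAllLines(Paths.get(path), StandardCharsets.UTF_8);
        return String.join("\n", lines);
    }
}
```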

You need to turn a String into a Path to pass it into a static method whose output you need to turn back into a String. Great idea! But what about something like this:
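Something like this hypothetical one-liner (no such method exists in the JDK; it is just what I wish for):

```java
// Hypothetical convenience API, not part of the JDK:
String content = Files.read("file.txt");
```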

However, it is not the fault of Java, but of a poorly implemented third party tool. If you need to put a File into a FileReader, which you need to put into a BufferedReader, to be able to read a file line by line into a StringBuilder, you are using a terrible I/O library! But anyway, you will be frustrated with Java and how verbose it is (and maybe also with the API itself).

This pattern applies to many other things: You are angry about your smartphone, because of a poorly coded app. You are angry about Eclipse because it crashes with a newly installed plugin. And so on…

I hope this pattern of frustration will not apply to Docker and the community will develop a stable ecosystem of tools to provide a solid basis for development and deployment with Docker. A tool like Docker lives through its ecosystem. If the tools are buggy or outdated, people will be frustrated with Docker – and that would be a shame, because Docker is really great!

Best regards,