Friday, March 2, 2018

Release 0.2.4: Attributes

This release increases quality, provides a new streaming tool and introduces attributes. Last but not least, a new scripting host was added to the supported list: Nashorn.

New version

Released right on time, here comes the new version:

Release content

Leftovers on gpf.require.define

When writing the article My own require implementation, the handling of JavaScript modules was split into two distinct parts:

  • CommonJS handling (mostly because of synchronous requires)
  • AMD / GPF handling (because they are asynchronous)

Furthermore, 'simple' CommonJS modules (i.e. modules with no require calls, only exports) were not supported.

Those problems were addressed by fixing issue #219.
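To illustrate, here is a minimal sketch of the kind of module that is now accepted; the file names are made up and the gpf.require.define call follows the pattern described in the My own require implementation article, so it may differ slightly from the actual API:

// constants.js: a 'simple' CommonJS module, no require call, only exports
exports.answer = 42;

// main.js: loading it through gpf.require.define (illustrative only)
gpf.require.define({
    constants: "constants.js"
}, function (require) {
    // require.constants is the resolved module
    console.log(require.constants.answer); // 42
});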

Quality focus

A significant effort was put into quality:

This is all explained in the article Travis integration.

GPF-JS quality

Nashorn support

Nashorn is another JavaScript engine implemented in Java. It's like Rhino's little brother: it is faster and delivered with the Java Runtime Environment.

This new Java host made me realize that the gpf.rhino namespace was badly named. It has been deprecated in favor of the gpf.java namespace.

Filtering stream

Previous release introduced gpf.stream.pipe to chain streams together.

This release delivers a filtering processor that forwards or blocks data based on a filtering function. Here is a modified version of the previous release's sample that includes filtering:

// Reading a CSV file and keep only some records
var csvFile = gpf.fs.getFileStorage()
        .openTextStream("file.csv", gpf.fs.openFor.reading),
    lineAdapter = new gpf.stream.LineAdapter(),
    csvParser = new gpf.stream.csv.Parser(),
    filter = new gpf.stream.Filter(function (record) {
        return record.FIELD === "expected value";
    }),
    output = new gpf.stream.WritableArray();

// csvFile -> lineAdapter -> csvParser -> filter -> output
gpf.stream.pipe(csvFile, lineAdapter, csvParser, filter, output)
    .then(function () {
        return output.toArray();
    })
    .then(function (records) {
        // process records
    });

Attributes

Back in 2016, an article introducing GPF-JS was written; it explains the concepts that were expected in the library: classes, interfaces and attributes.

The last pillar is introduced in this release.

It all starts with its simpler aspects:

More features will come soon, but this rewrite is very exciting as it fixes all the issues of the initial draft (developed in 2015).

Improved development environment

It is now possible to use an authentication token to connect to the GitHub platform (previously a user name and password were required). This opens the door to a new tile in the dashboard to speed up development.

This is already enabled in the release process, which has also been improved to prepare the following version in its final step. Unfortunately, the command line failed when tested on this release and the remaining steps had to be finished manually...

...which demonstrates how much value automation brings to this procedure...

Lessons learned

If it's not tested, it doesn't work. This was confirmed with the Travis integration, where the tools were run for the first time on a non-Windows environment.

The build process is taking longer, for two reasons:

  • There is a new host to test (Nashorn)
  • The library now has 9 legacy test files

A strategy will be decided to reduce the number of tests being run.

With the integration of Deepscan.io, a lot of cleaning has been applied because there is no easy way to exclude specific files from the tool (other than naming them one by one). All the unused files were moved from src & test to the new lost+found folder.

Next release

As the library becomes more complex (and bigger), it might evolve to propose several 'flavors'. The goal would be to provide smaller, dedicated versions that cut down the feature set.

For instance, a modern browser version could see the file system management removed as well as the compatibility layer.

Today, the dependencies.json file enumerates module dependencies. To be able to isolate them, a more accurate view is required and some modifications must be done inside the sources.json file.

The next release content will mostly focus on:

  • Documenting attributes
  • Refining the WScript simulation
  • Improving the development framework (illustrate the versions)
  • Adding a new stream helper to transform records

Thursday, March 1, 2018

Travis integration


This article summarizes how Travis CI was plugged into the GPF-JS GitHub repository to assess quality on every push.

Quality control

Since the beginning of the GPF-JS project, quality has been a high priority topic.

Over the years, a complete development and testing framework has been built:

  • Development follows TDD principles
  • The code is linted with eslint (also with jshint, even if it is redundant)
  • The maintainability ratio is computed using plato and verified through custom threshold values
  • Testing is done with all supported hosts
  • The coverage level is measured and controlled with custom threshold values
  • A web dashboard displays the metrics and gives access to the tools

However, these validations were done locally and they had to be run manually before pushing the code.

To enable collaboration and ensure that all pull requests go through the same validation process, a continuous integration platform was required.

Another approach consists of creating git hooks to force the quality assessment before pushing. However, it requires each contributor's environment to be properly configured. Having a central place where all the tests are run guarantees a correct and unbiased execution.

Travis CI

Travis CI is a free solution that integrates smoothly with GitHub. Actually, the GitHub credentials are used to log in to the platform.

After authentication, all the repositories are made available to Travis and each one can be enabled individually.

Documented steps

The selected repositories must be configured to instruct Travis on how the builds are done. To put it in a nutshell, a new file (.travis.yml) must be added.

Once everything is put together, the platform monitors pushes and triggers builds automatically. Finally, it documents the push in GitHub with a flag indicating whether the build succeeded.

Successful builds

When opening the Travis website, the user is welcomed with a nice dashboard listing the enabled repositories with their respective status.

Travis dashboard

The project README can be modified to show the last build result using an image delivered by the Travis infrastructure.
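For illustration, such a badge is usually embedded with a markdown image in the README; the repository path below is an assumption and the actual snippet may differ:

[![Build Status](https://travis-ci.org/ArnaudBuchholz/gpf-js.svg?branch=master)](https://travis-ci.org/ArnaudBuchholz/gpf-js)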

Changes in GPF-JS

.travis.yml

The .yml extension stands for YAML: a structured format that is commonly used for configuration files. In the context of Travis, it describes the requirements and steps to build and validate the project.

Requirements

One obvious requirement of the GPF-JS development environment is NodeJS. This is configured by defining the language setting and selecting the Long Term Support version.

Chrome is also required to enable browser testing.

Grunt is installed using the proper command line.

The remaining requirements are described by the project dependencies and are installed through NPM.

Steps

The default step executed by Travis is npm test.

The resulting command line is specified in the package.json file; it executes grunt with the following tasks:

  • connectIf to set up a local web server necessary for browser and gpf.http testing
  • concurrent:source to execute all tests with the source version (see library testing tutorial)
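As an illustration only, the corresponding npm script could look like the following sketch (the actual package.json may chain the tasks differently):

{
    "scripts": {
        "test": "grunt connectIf concurrent:source"
    }
}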

But, first, the GPF-JS development environment requires a configuration file to be generated: it contains the list of detected hosts, the metrics thresholds and the HTTP port to be used to serve files.

Configuration

The configuration file is interactively created the very first time the grunt command is executed. Indeed, user input is expected to approve or change some values (such as the HTTP port).

Grunt config

However, on Travis, no user can answer the questions. Hence, the tool must be fully automatic.

When grunt is triggered, the gruntfile.js first checks if any configuration file already exists. If not, a default task takes place to run the configuration tool.
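The idea can be sketched as follows; this is a simplified illustration where the configuration path and task names are assumptions, not the actual gruntfile.js content:

// gruntfile.js (simplified sketch): pick the default task depending on
// whether a configuration file already exists
module.exports = function (grunt) {
    if (grunt.file.exists("tmp/config.json")) { // hypothetical configuration path
        grunt.registerTask("default", ["serve"]); // hypothetical normal default task
    } else {
        grunt.registerTask("default", ["configure"]); // first-launch configuration task
    }
};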

To support Travis, this default task processing was isolated in a folder of 'first-launch' configuration tasks. The name of the file is used to name the task.

The configuration tool was then modified to handle a so-called quiet mode and the Travis configuration task leverages this option. Additionally, the task also executes the quality tasks (linters, plato & coverage).

So, the command grunt travis is executed before the testing step to ensure the existence of the configuration file.
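Putting the pieces together, the resulting .travis.yml is close to the following sketch (the actual file in the repository may differ slightly):

language: node_js
node_js:
  - "lts/*"
addons:
  chrome: stable        # required for browser testing
install:
  - npm install -g grunt-cli
  - npm install
before_script:
  - grunt travis        # generates the configuration file and runs the quality tasks
# script defaults to npm test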

Challenges

After figuring out the different steps, a long debugging session - mostly composed of trial-and-error cycles - was required to understand the specifics of the platform.

The history of pushes on the Travis integration issue illustrates all the attempts that were made.

Non-Windows environment

"If it's not tested, it doesn't work"

The very first issue was related to the environment itself. The development being done on Windows, the library had never been tested on a different operating system before.

Travis offers the possibility to use different flavors but the default one is Linux.

As a consequence, some scripts were adapted to handle the differences in the path separator (for instance).

NodeJS can absorb those differences. This is the reason why the library uses / as a path separator but translates back to \ when running on a Windows-specific host.
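As a hedged illustration of that choice (the actual library code may differ), the translation can be as simple as:

// Illustrative only: internal paths always use "/" and are converted
// to "\" only when interacting with a Windows-specific host (e.g. WScript)
function toWindowsPath (internalPath) {
    return internalPath.split("/").join("\\");
}

console.log(toWindowsPath("src/stream/filter.js")); // src\stream\filter.js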

Browser testing

Another issue was related to browser testing. As a matter of fact, the Travis environment does not provide any graphical user interface by default, which means that a browser cannot be started the usual way.

Fortunately, this is explained in the Travis documentation where several options are proposed.

And, luckily, the GPF-JS development environment provides an alternate way to test browsers.

Consequently, the headless mode of Chrome is used.

WScript

The operating system being Linux, the Microsoft Scripting Host is not available.

Unlike the path separator issue, this is a normal situation: the configuration file includes a flag that is set upon detection of this scripting host.

However, one unexpected effect is bound to the code coverage measurement: a big part of the library is dedicated to bringing older hosts to a common feature set, and WScript was exercising most of it. Thus, the coverage minimum was not reached and the build was failing.

This was temporarily addressed by disabling the coverage threshold errors.

Also, thanks to the WScript emulator project from Mischa Rodermond, a simple simulator implemented with NodeJS validates the WScript specifics.

There are still some uncovered parts, but they will be fixed in a future release.

Other goodies

After digging through GitHub projects, I also discovered and integrated the coveralls package, which can be used in conjunction with Travis so that coverage statistics are uploaded to a dedicated dashboard.

I also added a dependency checker as well as a code quality checker (which required some cleaning).

Everything is visible from the project README.

GPF-JS quality