
Scott Stark's Blog


I will be speaking at the GOTO Chicago event about Quarkus/MicroProfile this coming Monday, April 29

 

Session abstract: https://gotochgo.com/2019/sessions/878

 

Supersonic, Subatomic Eclipse MicroProfile

What if there was a way you could take advantage of the latest microservice architectures by leveraging the developers and skills you already have?!? What if you could do that with blazingly fast startup and crazy low memory consumption?!? In this session we’ll show you how with Eclipse MicroProfile and Red Hat’s Quarkus. We discuss all the cool features it allows you to easily use, such as OpenTracing and Metrics.

Then we move onto a demo that showcases what’s possible with Eclipse MicroProfile, utilizing the existing specifications, running in Quarkus. We will develop a microservice that integrates all the specifications. By the end of the session the attendee will have a better understanding of Eclipse MicroProfile, and how to develop applications with Quarkus.

Speaking at GOTO Chicago

The Eclipse EE4J project has reached a milestone of having all of its sub-projects available in the Eclipse EE4J GitHub organization, with nightly and release builds run under the Eclipse CI environment. This includes the Jakarta EE 8 TCK project, the certification test suite that had been a closed source offering only available to licensees under Java EE 8 and earlier. The upcoming GlassFish 5.1 release will be based on the release artifacts produced from the Eclipse CI environment. While GlassFish has been largely Open Source for some time, it is a big step forward for the Enterprise Java community to have all dependent projects available under the Eclipse Foundation infrastructure, and for the server to be certified against a build of the Open Source TCK. An overview of the GlassFish 5.1 sub-projects status can be found here. The final release of the GlassFish 5.1 server that fully passes the Jakarta EE 8 TCK is expected in January 2019. The GlassFish 5.1 release plan can be found here. Red Hat expects our WildFly project to have a release that passes the Jakarta EE 8 TCK in January 2019 as well.

 

Part of this move has updated the groupId, artifactId, version (GAV) coordinates of the project release artifacts. The details of the change from the old to the new GAV coordinates are documented in the New_Maven_Coordinates wiki page.

 

The update of the Jakarta EE Specification Process continues to evolve. I again encourage you to read and provide feedback on the current draft available from Mike Milinkovich's blog post.

Today the first candidate release of the Eclipse GlassFish server targeting the new Jakarta EE 8 release is available. This was a huge effort to move the GlassFish source repositories over to the Eclipse GitHub organization. It was accompanied by the move of the TCK project as well. You can read about the details in Dmitry Kornilov’s blog here.


Red Hat’s Support of Jakarta EE

Red Hat is committed to supporting the evolution of enterprise Java at Eclipse and has been focusing on development of the Eclipse Jakarta EE specification process as well as helping to get the migrated projects and TCK projects running under the Eclipse CI infrastructure.

 

The new specification process is a replacement for the Java Community Process (JCP) used to develop the Java EE specification through Java EE 8. It provides a fully open source based process that includes specifications, APIs and TCKs. The Eclipse Jakarta EE specification process will be used to develop the next generation of the EE4J specifications. Mike Milinkovich has written about the current process draft status in detail here. The initial draft is in public review, so I recommend you take the opportunity to browse through it and make comments on the draft document provided in Mike’s blog post.

 

I encourage you to participate in the selection of the logo to use for the Jakarta EE project by going to the following Google Form and ranking your choices:

The Jakarta EE Logo Community Vote

There is a Jakarta EE Developer Survey 2018 that you can participate in to shape the next generation of the enterprise Java community.

Take Survey

 

More information:

Jakarta EE Working Group Charter

Jakarta EE Working Group FAQ

 

MicroProfile 1.2 Conference App Demo

This blog talks about running an updated version of the MicroProfile 1.2 based conference demo application on a local minishift cluster along with the latest WildFly Swarm 2018.2.0 supporting MicroProfile 1.2. You can find a PDF version of this blog along with a video walkthrough at: Release MicroProfile 1.2 Features Companion Demo - Minishift/WildFly Swarm 2018.2.0 · MicroProfileJWT/microprofile-confe…

 

Prerequisites

  1. Install the Minishift binary

  2. Install VirtualBox if needed

  3. Clone the https://github.com/MicroProfileJWT/microprofile-conference.git project

  4. cd microprofile-conference

  5. Start minishift using the config-minishift.sh script in the microprofile-conference root directory

  6. Configure your environment:

[starksm64-microprofile-conference 524]$ eval $(minishift oc-env)
[starksm64-microprofile-conference 525]$ type oc
oc is /Users/starksm/.minishift/cache/oc/v3.7.1/darwin/oc
[starksm64-microprofile-conference 526]$ eval $(minishift docker-env)
[starksm64-microprofile-conference 1568]$ oc login $(minishift ip):8443 -u admin -p admin
Login successful.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * myproject
    openshift
    openshift-infra
    openshift-node

Using project "myproject".
  7. Open the minishift console using minishift console

Build and Deploy the Microservices

  1. Build the services:

[starksm64-microprofile-conference 1559]$ mvn clean install -DskipTests=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Conference
[INFO] Conference :: Bootstrap Data
[INFO] Conference :: Authorization
[INFO] Conference :: Session
[INFO] Conference :: Vote
[INFO] Conference :: Speaker
[INFO] Conference :: Schedule
[INFO] Conference :: Web
[INFO] Conference :: Start
...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Conference ......................................... SUCCESS [  0.693 s]
[INFO] Conference :: Bootstrap Data ....................... SUCCESS [  2.593 s]
[INFO] Conference :: Authorization ........................ SUCCESS [ 12.907 s]
[INFO] Conference :: Session .............................. SUCCESS [  8.802 s]
[INFO] Conference :: Vote ................................. SUCCESS [ 12.265 s]
[INFO] Conference :: Speaker .............................. SUCCESS [  9.020 s]
[INFO] Conference :: Schedule ............................. SUCCESS [ 15.670 s]
[INFO] Conference :: Web .................................. SUCCESS [ 33.957 s]
[INFO] Conference :: Start ................................ SUCCESS [  0.032 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:36 min
[INFO] Finished at: 2018-02-16T00:00:40-08:00
[INFO] Final Memory: 112M/1154M
[INFO] ------------------------------------------------------------------------
  2. Install the services into the minishift environment using the cloud-deploy.sh script:

[starksm64-microprofile-conference 1571]$ ./cloud-deploy.sh
[INFO] Scanning for projects...
[INFO] ...
deployment "microservice-vote" created
service "microservice-vote" created
route "microservice-vote" exposed
[starksm64-microprofile-conference 1572]$ oc status
In project My Project (myproject) on server https://192.168.99.100:8443

http://microservice-authz-myproject.192.168.99.100.nip.io to pod port http (svc/microservice-authz)
  pod/microservice-authz-3124937629-wl8g7 runs example/microservice-authz:latest

http://microservice-schedule-myproject.192.168.99.100.nip.io to pod port http (svc/microservice-schedule)
  pod/microservice-schedule-3040366544-n82zt runs example/microservice-schedule:latest

http://microservice-session-myproject.192.168.99.100.nip.io to pod port http (svc/microservice-session)
  pod/microservice-session-1164112827-r8z9r runs example/microservice-session:latest

http://microservice-speaker-myproject.192.168.99.100.nip.io to pod port http (svc/microservice-speaker)
  pod/microservice-speaker-2311407995-4mt9p runs example/microservice-speaker:latest

http://microservice-vote-myproject.192.168.99.100.nip.io to pod port http (svc/microservice-vote)
  pod/microservice-vote-2774736211-wzzhz runs example/microservice-vote:latest

View details with 'oc describe <resource>/<name>' or list everything with 'oc get all'.
[starksm64-microprofile-conference 1573]$
  3. Update the web-application/src/main/local/webapp/WEB-INF/conference.properties service URLs to use the value for minishift ip in your environment. In my environment 192.168.99.100 is the IP address. Globally replace 192.168.99.100 with whatever is returned in your minishift setup.

  4. Run the web application front end:

mvn package tomee:run -pl :web-application -DskipTests
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Conference :: Web 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ...
miTargets] to [true] as the property does not exist.
INFO - Starting ProtocolHandler [http-nio-8080]
INFO - Starting ProtocolHandler [ajp-nio-8009]
INFO - Server startup in 3177 ms
  5. Open the web application at http://localhost:8080/

Code Walkthrough

In this section we take a look at the code behind the MicroProfile features in use in the conference application.

MP-JWT

The JWT RBAC for MicroProfile (MP-JWT) feature defines how JSON Web Tokens (JWT) may be used for authentication and role-based authorization. The MP-JWT feature also defines an API for accessing the claims associated with JWTs. In the conference application demo, the microservice-session service uses the JWT groups claim and a custom application claim. The following code snippet demonstrates the MP-JWT API.

Code from: microservice-session/src/main/java/io/microprofile/showcase/session/SessionResource.java
import org.eclipse.microprofile.jwt.JsonWebToken;

@ApplicationScoped
public class SessionResource {

    /**
     * The current MP-JWT for the authenticated user
     */
    @Inject
    JsonWebToken jwt;
...
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @Timed
    public Collection<Session> allSessions(@Context SecurityContext securityContext) throws Exception {
        requestCount.inc();
        if (jwt == null) {
            // User was not authenticated
            System.out.printf("allSessions, no token\n");
            return Collections.emptyList();
        }
        String userName = jwt.getName();
        // Use the isUserInRole of container to check for VIP role in the JWT groups claim
        boolean isVIP = securityContext.isUserInRole("VIP");
        System.out.printf("allSessions(%s), isVIP=%s, User token: %s\n", userName, isVIP, jwt);
        // Check if the user has a session_time_preference custom claim in the token
        Optional<String> sessionTimePref = jwt.claim("session_time_preference");
        if (sessionTimePref.isPresent()) {
            // Create a session filter for the time preference...
        }
        // If the user does NOT have a VIP role, filter out the VIP sessions
        Collection<Session> sessions;
        if (!isVIP) {
            sessions = sessionStore.getSessions()
                .stream()
                .filter(session -> !session.isVIPOnly())
                .collect(Collectors.toList());
        } else {
            sessions = sessionStore.getSessions();
        }
        return sessions;
    }
1. Injection of the MP-JWT token as a JsonWebToken interface.
2. If there is no token, return an empty collection of sessions.
3. Check for a VIP role in the token using the container’s isUserInRole(String) method. This internally maps to the token’s MP-JWT defined groups claim.
4. Illustrates programmatic lookup of a custom claim not defined by the MP-JWT spec. In this example the session results would be filtered to only return those matching the session time of day preference.
5. This code block checks the incoming MP-JWT token to see if it has a groups claim that contains the VIP value. If the VIP value does not exist, the sessions are filtered to remove those that have the isVIPOnly property set. Otherwise, all sessions are returned because the token has the VIP group.
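MP-JWT also allows individual claims to be injected directly instead of being read from the JsonWebToken. The following sketch is not taken from the conference application; the bean is hypothetical and simply illustrates the @Claim/ClaimValue injection style defined by the MP-JWT 1.0 API for the same custom claim used above.

import java.util.Optional;

import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;

import org.eclipse.microprofile.jwt.Claim;
import org.eclipse.microprofile.jwt.ClaimValue;

@RequestScoped
public class ClaimInjectionExample {

    // Injects the custom claim the SessionResource looks up programmatically.
    // ClaimValue resolves the claim from the caller's current token.
    @Inject
    @Claim("session_time_preference")
    private ClaimValue<Optional<String>> sessionTimePref;

    public boolean hasTimePreference() {
        return sessionTimePref.getValue().isPresent();
    }
}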

MP-Configuration

The MicroProfile Config (MP-Config) specification supports injection and programmatic lookup of external configuration information via a common API. The MP-Config spec defines three common configuration sources:

  • System environment variables
  • System properties
  • A META-INF/microprofile-config.properties file bundled with the application

SPIs are defined for adding configuration sources as well as for converting from string to arbitrary types.
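As an illustration of the configuration source SPI (a hedged sketch, not code from the conference application; the class name and property are hypothetical), a custom ConfigSource only needs to supply a name, an ordinal, and a map of properties, and is then registered via a META-INF/services/org.eclipse.microprofile.config.spi.ConfigSource entry:

import java.util.HashMap;
import java.util.Map;

import org.eclipse.microprofile.config.spi.ConfigSource;

/**
 * A minimal custom configuration source that serves a single hard-coded property.
 */
public class DemoConfigSource implements ConfigSource {

    private final Map<String, String> properties = new HashMap<>();

    public DemoConfigSource() {
        properties.put("demo.greeting", "Hello from a custom ConfigSource");
    }

    @Override
    public Map<String, String> getProperties() {
        return properties;
    }

    @Override
    public String getValue(String propertyName) {
        return properties.get(propertyName);
    }

    @Override
    public String getName() {
        return "demo-config-source";
    }

    @Override
    public int getOrdinal() {
        // Higher ordinals take precedence over the default sources
        return 500;
    }
}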

The microprofile conference app makes use of injection of META-INF/microprofile-config.properties, environment variables, and the conversion SPI. The first code snippet we will look at injects a value from the bundled META-INF/microprofile-config.properties as a java.security.PrivateKey. The AuthzResource from the microservice-authz project shows the injection:

Code from: microservice-authz/src/main/java/io/microprofile/showcase/tokens/AuthzResource.java, PrivateKeyConverter.java
import java.security.PrivateKey;

@ApplicationScoped
public class AuthzResource {

    /**
     * An example of injecting a custom property type
     */
    @ConfigProperty(name = "authz.signingKey")
    @Inject
    private PrivateKey signingKey;
...

# The META-INF/microprofile-config.properties entries
authz.signingKey=/privateKey.pem

import java.security.PrivateKey;

import org.eclipse.microprofile.config.spi.Converter;

import static io.microprofile.showcase.tokens.TokenUtils.readPrivateKey;

/**
 * A custom configuration converter for {@linkplain PrivateKey} injection using
 * {@linkplain org.eclipse.microprofile.config.inject.ConfigProperty}
 */
public class PrivateKeyConverter implements Converter<PrivateKey> {

    /**
     * Converts a string to a PrivateKey by loading it as a classpath resource
     * @param s - the string value to convert
     * @return the PrivateKey loaded as a resource
     * @throws IllegalArgumentException - on failure to load the key
     */
    @Override
    public PrivateKey convert(String s) throws IllegalArgumentException {
        PrivateKey pk = null;
        try {
            pk = readPrivateKey(s);
        } catch (Exception e) {
            IllegalArgumentException ex = new IllegalArgumentException("Failed to parse ");
            ex.initCause(e);
            throw ex;
        }
        return pk;
    }
}
1. The config property name reference to match against a config source.
2. The custom PrivateKey value injection site.
3. The mapping from the referenced "authz.signingKey" name to a string value in the standard META-INF/microprofile-config.properties.
4. The custom converter implementation that takes the input string value and transforms it into a PrivateKey by loading it as a resource from the classpath.
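The demo relies on injection, but the same values can also be resolved programmatically through the Config API. The following is a small sketch (the "authz.signingKey" property is the one from the demo; the surrounding class and the "demo.greeting" property are hypothetical):

import java.util.Optional;

import org.eclipse.microprofile.config.Config;
import org.eclipse.microprofile.config.ConfigProvider;

public class ProgrammaticLookupExample {

    public static void main(String[] args) {
        // Resolves the Config associated with the current application class loader
        Config config = ConfigProvider.getConfig();

        // Required lookup: throws if no config source defines the property
        String signingKeyLocation = config.getValue("authz.signingKey", String.class);

        // Optional lookup: empty when the property is not defined anywhere
        Optional<String> greeting = config.getOptionalValue("demo.greeting", String.class);

        System.out.println("signingKey resource = " + signingKeyLocation);
        System.out.println("greeting = " + greeting.orElse("<not set>"));
    }
}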

A further example usage of the MP-Config will be seen in the next section on the health check feature.

MP-Health

The MicroProfile Health Check (MP-Health) feature allows one to define application health check endpoints, as commonly used in cloud environments to validate availability and liveness. The MP-Health feature supports this along with the ability to define a JSON payload that can be used to convey additional information.

The following microservice-session MP-Health code snippet shows an example health check implementation that makes use of the MP-Config API to inject configuration used during construction of the health response.

Code from: microservice-session/src/main/java/io/microprofile/showcase/session/SessionCheck.java
import org.eclipse.microprofile.config.inject.ConfigProperty;
import org.eclipse.microprofile.health.Health;
import org.eclipse.microprofile.health.HealthCheck;
import org.eclipse.microprofile.health.HealthCheckResponse;

@Health
@ApplicationScoped
public class SessionCheck implements HealthCheck {

    @Inject
    private SessionStore sessionStore;

    @Inject
    @ConfigProperty(name = "sessionCountName", defaultValue = "sessionCount")
    private String sessionCountName;

    @ConfigProperty(name = "JAR_SHA256")
    @Inject
    private String jarSha256;

    @Override
    public HealthCheckResponse call() {
        return HealthCheckResponse.named("sessions-check")
            .withData(sessionCountName, sessionStore.getSessions().size())
            .withData("lastCheckDate", new Date().toString())
            .withData("jarSHA256", jarSha256)
            .up()
            .build();
    }
}
1. The annotation marking the bean as a health check endpoint.
2. The HealthCheck interface the endpoint implements to provide the health callback.
3. An example of externalizing a data label used in the health check response whose value is defined in the application META-INF/microprofile-config.properties.
4. An example of injecting a config value whose source is an environment variable defined in the microservice-session OpenShift deployment descriptor.
5. The HealthCheck call endpoint that returns the HealthCheckResponse.
6. The various withData calls add labelled values, including the injected config values, to the JSON payload.

MP-Metrics

The MicroProfile Metrics (MP-Metrics) feature aims to provide a unified way for MicroProfile services to export monitoring data via a common API.

Code from: microservice-session/src/main/java/io/microprofile/showcase/session/SessionResource.java
import org.eclipse.microprofile.metrics.Counter;
import org.eclipse.microprofile.metrics.Histogram;
import org.eclipse.microprofile.metrics.Metadata;
import org.eclipse.microprofile.metrics.MetricRegistry;
import org.eclipse.microprofile.metrics.MetricType;
import org.eclipse.microprofile.metrics.annotation.Metric;
import org.eclipse.microprofile.metrics.annotation.Timed;

@ApplicationScoped
public class SessionResource {

    @Inject
    @Metric(name = "requestCount", description = "All JAX-RS request made to the SessionResource",
        displayName = "SessionResource#requestCount")
    private Counter requestCount;

    /**
     * The application metrics registry that allows access to any metric to be accessed/created
     */
    @Inject
    private MetricRegistry metrics;

    @PostConstruct
    void init() {
        Collection<Session> sessions = sessionStore.getSessions();
        System.out.printf("SessionResource.init, session count=%d\n", sessions.size());
        // Create a histogram of the session abstract word counts
        Metadata metadata = new Metadata(SessionResource.class.getName() + ".abstractWordCount", MetricType.HISTOGRAM);
        metadata.setDescription("Word count histogram for the session abstracts");
        Histogram abstractWordCount = metrics.histogram(metadata);
        for (Session session : sessions) {
            String[] words = session.getAbstract().split("\\s+");
            abstractWordCount.update(words.length);
        }
    }

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    @Timed
    public Collection<Session> allSessions(@Context SecurityContext securityContext) throws Exception {
        requestCount.inc();
...
    }

    @GET
    @Path("/{sessionId}")
    @Produces(MediaType.APPLICATION_JSON)
    @Timed
    public Response retrieveSession(@PathParam("sessionId") final String sessionId) throws Exception {
        requestCount.inc();
...
    }

    @GET
    @Path("/{sessionId}/speakers")
    @Produces(MediaType.APPLICATION_JSON)
    @Timed
    public Response sessionSpeakers(@PathParam("sessionId") final String sessionId) throws Exception {
...
    }
1. Defines a Counter type metric named requestCount.
2. Injection of the MetricRegistry interface allows for programmatic creation and lookup of metrics, as will be done in init().
3. Sets up the metadata for an abstractWordCount metric of type Histogram.
4. The actual creation of the Histogram metric via the injected MetricRegistry instance.
5. Population of the abstractWordCount histogram from the various session abstracts.
6. The allSessions, retrieveSession and sessionSpeakers endpoint methods are annotated with @Timed to indicate that the MP-Metrics layer should intercept the method invocations and create statistics for them.
7. Programmatic updates of the injected requestCount metric are seen in the allSessions and retrieveSession endpoint methods.
Last updated 2018-02-22 08:48:28 PST

 

 

This blog introduces concerns that members of the Red Hat middleware team, the Apache Maven chair, Paremus, Sonatype, as well as other Java Executive Committee (EC) members have regarding the JSR-376 Java Platform Module System specification, and the Jigsaw implementation of that specification. These concerns have arisen from Red Hat's participation in the JSR-376 expert group (EG) and experience with the Jigsaw early access releases.

 

This is a rather long posting, so I have attached a PDF version of the contents that includes a table of contents with links for easier navigation.

 

 

An analysis of the Jigsaw technology and its relationship to JPMS (JSR-376)

Contributors

  • David Lloyd (Red Hat)
  • Jason Greene (Red Hat)
  • Scott Stark (Red Hat)
  • Mark Little (Red Hat)
  • Mark Proctor (Red Hat)
  • Robert Scholte (Chairman, Apache Maven project)
  • Neil Bartlett (Paremus)
  • Brian Fox (Sonatype, ASF)

 

Summary

Reinvention, Not Standardization

The Jigsaw implementation is a new module system which has worked successfully for modularising Java itself, but is largely untried in wider production deployments of real applications on top of the JVM.  Many application deployment use cases which are widely implemented today are not possible under Jigsaw, or would require a significant re-architecture.

Reductive Design Principles

Jigsaw's key design points are predicated on a reductive approach to forward compatibility, which works well for modularising Java itself, but becomes restrictive for the broader use cases that application deployments have. By enforcing the philosophies that make sense for modularization and encapsulation of the Java platform itself into the application domain, the specification actually reduces the ability for application developers to easily adapt to this particular implementation of a module system.

As a result of drawing requirements from the prototype implementation’s primary behaviors, the set of use cases which are now considered acceptable has been limited to conform to implementation preference. We would prefer to have extracted the requirements and design from existing application deployment use cases.  Many practices which were considered routine and useful in Java are now redefined as anti-patterns in Jigsaw, as described in the Technical Contention Points section of this document (e.g. “Cyclic Dependencies”, “Concealed package conflicts”, “Reflection Behavioral Changes”, “Module Naming Restrictions”, “Adding packages is necessary”, “Service Loading Changes”, “Resources and Modules”).

This results in a subtraction of capabilities for any code consuming Jigsaw. Conversely, we believe that JPMS should be conceived as a fixed set of added capabilities which allow for new use cases, without excluding existing use cases from being able to migrate to or take advantage of modularity.

A Disrupted Ecosystem

Jigsaw's implementation will eventually require millions of users and authors in the Java ecosystem to face major changes to their applications and libraries, especially if they deal with services, class loading, or reflection in any way.  Most of these changes are derived from the implementation choices of Jigsaw and the requirements that were drawn from it.

The specification was written to promote certain best practices (e.g. modules are the ultimate authority for determining package access and dependency information, modules should be immutable with a complete eagerly resolvable dependency set, packages should never be duplicated, dependencies should never contain cycles, etc.).  This works well for modularising Java itself but is a new, untested, and unproven architecture for deploying applications in a modular manner. In some cases the implementation of Jigsaw contradicts years of modular application deployment best practices that are already commonly employed by the ecosystem as a whole.

Fragmentation of the Java community

Due to the lack of a one-to-one mapping of use cases (or of sufficient interoperability capabilities) and other restrictions, we are concerned that there will likely be two worlds of Java software development: the Jigsaw world, and the “everything else” world (Java SE class loaders, OSGi, JBoss Modules, Java EE, etc.). A library developer will either need to pick which world to support, or deal with the burdens of a 'maintaining both' strategy.

Failure to Meet Major JSR Submission Goals

The JSR submission goals outline certain expectations that are integral to acceptance of the JPMS final release. Several of these goals are not met by the current Jigsaw implementation.

Approachable, yet scalable

The JSR submission specifically expresses that the implementation should support large-scale development.  The submission states that:

"This JSR will define an approachable yet scalable module system for the Java Platform. It will be approachable, i.e., easy to learn and easy to use, so that developers can use it to construct and maintain libraries and large applications for both the Java SE and Java EE Platforms."

For the purpose of evaluating this subjective goal, we define “easy to use” as:

  • Having equal or greater robustness (tolerance of user input) to Java today
  • Having equal or lesser effort required by the user to “construct and maintain libraries and large applications” in Java today

Constructing a Jigsaw application is definitively less robust than Java today, as Jigsaw imposes a number of additional restrictions that will result in errors not previously encountered (see the sections covering concealed package conflicts, split packages, duplicate packages, multiple module versions, module naming restrictions, cyclic dependencies, JSR-250’s awkward place, service loader changes, reflection behavior changes, etc.). Constructing a Jigsaw application also definitively requires more effort by the user to “construct and maintain libraries and large applications”: a Jigsaw application/library must either define one or more additional module descriptors (module-info.java) that accurately define the semantics of the respective module, or utilize automatic modules, which involve additional special rules that must be taken into account by the user.

Additionally, there is an impedance mismatch with widely adopted practices for assembling software in the Java ecosystem, as expanded on in the Impedance Mismatch with Maven section. The extra burden imposed logically scales in proportion to the size of an application, as does the probability of an error-generating input or restriction violation. Therefore, we believe that this JSR goal appears unmet, in particular for the target class of “large applications” (which will commonly involve the blending of multiple independent projects).

Leveraged by Java EE 9

It has been made clear since the beginning of the JSR process that it is expected to provide a basis upon which Java EE 9 can be built.  As stated in the submission:

"This JSR targets Java SE 9. We expect the module system to be leveraged by Java EE 9, so we will make sure to take Java EE requirements into account."The limitations in Jigsaw almost certainly prevent the possibility of Java EE 9 from being based on Jigsaw, as to do so would require existing Java EE vendors to completely throw out compatibility, interoperability, and feature parity with past versions of the Java EE specification. 

Concerns about Jigsaw as a complete solution

The patterns introduced within Jigsaw are (in some cases) going to be extremely difficult to fix even in a later release, and will create backwards- and forwards-compatibility problems that will be very difficult to unwind.  The result will be a weakened Java ecosystem at a time when rapid change is occurring in the server space, with increasing use of languages like Go. These problems, which are outlined in detail in this document, range from adoption issues, to changes to distribution models, to fragmentation of the ecosystem, and more.

A Visual Comparison Between Modular Implementations

The following table serves as a high-level summary of some of the more significant capabilities which are not met by the Jigsaw approach relative to existing modular system approaches.  The individual points are expanded upon in greater detail in the Technical Contention Points section of this document.

Capability                                          | Jigsaw | Class-Loader | OSGi | Java EE (Spec) | ext/lib
Allows cycles between packages in different modules |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Isolated package namespaces                         |        | ✔︎           | ✔︎   | ✔︎             |
Allows lazy loading                                 |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Allows dynamic package addition                     |        | ✔︎           | ✔︎   | ✔︎             |
Unrestricted naming                                 |        | ✔︎           | ✔︎   | ✔︎             |
Allows multiple versions of an artifact             |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Allows split packages                               |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Module redefinition                                 |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Allows textual descriptor                           |        | ✔︎           | ✔︎   | ✔︎             | ✔︎
Theoretically Possible to AOT-compile               | ✔︎     | ✔︎           | ✔︎   | ✔︎             | ✔︎


Refining Jigsaw and timelines

Many of the issues could be fixed in a short amount of time, (e.g. layer primitives, circularity, version restrictions, etc.).  Others might require a bit more time to get right, but would lead to a much better overall platform and user experience.  A small delay is worth the cost if the alternative is rushing a solution that doesn't cover all use cases.  It might also be possible to add additional hooks that could be leveraged by third-party code to improve the experience.

Technical Contention Points

Cyclic Dependences

The implementation forbids dependency cycles among modules during compilation, link, and run time.  Disallowing cycles during compilation is an accepted and historical behavior in Java; however, disallowing cycles at run time is not, and will cause surprising problems for the user at deployment time.  Such cycles might even reflect engineering choices that are required to fulfill certain use cases.
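As a concrete illustration (the module names here are hypothetical), the following pair of module descriptors is rejected because the requires edges form a cycle:

// module-info.java for the first module
module com.example.alpha {
    requires com.example.beta;   // alpha needs types from beta
}

// module-info.java for the second module
module com.example.beta {
    requires com.example.alpha;  // beta needs types back from alpha: the cycle is rejected
                                 // at compile time and again by the resolver at run time
}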

Current specification

The Public Review specification has the following to say on the matter:

 

 

 

"It is a compile-time error if the declaration of a module expresses a dependence on itself, either directly or indirectly." - proposed JLS § 7.7.1

"When all modules have been resolved then the resulting dependency graph is checked to ensure that it does not contain cycles. A readability graph is constructed, and in conjunction with the module exports and service use, checked for consistency." - proposed JDK specification for class java.lang.module.Configuration

 

The proposed JVM specification does not specify that module cycles are forbidden during class resolution or initialization.

Run-time problems

When modules are built, they are compiled against a set of classes which form the Application Binary Interfaces (ABIs) that the module requires in order to function.  But it would often be the case that the final module is then included in a different environment entirely - either in a container, or else as a result of reuse.  Nontrivial module environments can easily contain "long cycles" where a number of innocuous dependency relationships exist, but happen to form a cycle when certain combinations of modules are assembled.

Bypass the resolver

JPMS authors recommend that runtime support for circularity be added by container providers such as OSGi, JBoss Modules, or Java EE containers by bypassing the Jigsaw resolver completely, and using a custom class loader implementation to resolve class linkage questions. This solution is completely functional for containers, but it is not functional for standalone modular applications.  In addition, containers will suffer from the deficiency that any software which inspects such a module's dependencies using the java.lang.reflect API, including a module inspecting its own dependencies, will see only a subset of them (typically, an empty set).

Compromise proposal

A compromise proposal is to continue to forbid cyclic dependences at compilation time (as this behavior is consistent with current practice and javac behavior), but to relax restrictions at link and run time so that assemblies of modules will not fail unexpectedly when the dependency graph changes between the build environment and the production environment. This proposal has not yet been addressed.

 

     Quotes:

"The JPMS resolver does not allow cycles amongst modules; this has long been the case.  (Circularity amongst classes is allowed, as it must be.) If you want to allow cycles amongst your own modules then you can resolve them yourself and add whatever cycle-inducing readability edges you need." - Spec Lead, in this post (2016)

Ideology

This is at its heart an ideological disagreement.  It has been posited that the presence of circular dependencies is an anti-pattern and a design error: http://openjdk.java.net/projects/jigsaw/spec/issues/#CyclicDependences. The primary supporting argument is that all modules which form a cycle are logically one module.  However this at best applies only to limited cycles of a small fixed number of modules which come from the same author and are produced at the same time.  Applications, even relatively small ones, now consist of dozens or hundreds of distinct pieces from a multitude of sources.  Maven has been a big enabler of this: by allowing application dependencies to be managed automatically the friction of doing so has been greatly reduced, and including substantial dependency graphs in an application has become an increasingly common practice.

Automatic Modules

Automatic modules are purported as a compatibility mechanism allowing JARs to naturally grow into modules in a modular environment. The idea is that a module would be automatically generated from a JAR, with a name that is derived from the name of that JAR.  The name would undergo various transformations to make it align with the proposed naming convention of modules. The proposed behavior suffers from various undesirable side-effects.  Many participants in the discussion seem to agree that automatic modules bring more harm than good.
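For example (a sketch; the consuming module name is hypothetical), a plain JAR named commons-beanutils-1.9.3.jar placed on the module path is treated as an automatic module whose name is derived from the file name: the version suffix and the .jar extension are stripped and the remaining hyphens become dots, so a consumer ends up declaring:

module com.example.app {
    // "commons-beanutils-1.9.3.jar" becomes automatic module "commons.beanutils";
    // the name is controlled by the file name, not by the library's author
    requires commons.beanutils;
}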

 

     Quotes:

"... automatic modules in general are not a good solution to the problem space in general" - Stephen Colebourne, JSR 310 spec lead in this post

"I regard automatic modules as one of the most dangerous and poorly specified areas of the current spec, and will be taking this up with the other members of the EG." - Neil Bartlett, current JSR 376 EG member in this post

New Tooling Required

Users will be relying on build tooling like Maven to create their modules and their distribution environments.  Already today there is at least one tool (https://github.com/moditect/moditect) which can modularize an archive, and it is expected that more will appear. If the other issues listed in this document can be resolved, modularizing a JAR could be as simple as choosing a name and reviewing the results of the calculation of existing modules and Maven dependency metadata.  Even manually specifying dependencies could be a fast and easy way to modularize an existing artifact.

Tooling Prevented from helping bridge the module name chasm

Recently the Module-Name metadata field was removed from the proposal. This field would have allowed a developer to express their intended module name separately from fully modularizing their own code. It would let someone keep their otherwise legacy module from being subjected to the default automodule name algorithm, which only uses elements of the file name as the module name. Not having Module-Name available creates an inherently unstable automatic module naming solution, and will likely cause conflicts between otherwise properly namespaced modules.
For reasons of name instability, the current guidance (for adopters of Jigsaw) is to block or discourage publishing of libraries that depend upon automodules. This leads to the problem where no library creator can ever fully modularize until all of his/her dependencies have also done so!
With an ecosystem that has transitive dependencies (sometimes dozens to hundreds of layers deep) and with some of those very deep dependencies quite stable and therefore infrequently updated, this will likely mean that some components will never be able to be fully modularised.
The vision behind the Module-Name metadata was simply that it would make it easy for module authors starting with Jigsaw to immediately choose their module name. It could make choosing and declaring a name easy (maybe even required) very quickly for library authors. That means that the ecosystem could start to build up the metadata that is missing. The JSR could still do so (starting now), and as Jigsaw starts to hit critical mass, there would hopefully be very few important libraries that aren't properly named by their authors as intended.
The bar to picking a good name is clearly much lower than fully modularizing, especially if there is social pressure to do so before all your dependencies have.
If developers could start declaring their Module-Name early, and the rule against automodule dependencies is redefined such that it's OK to lean on something with a Module-Name, it becomes much easier and quicker for the ecosystem to get to a building point for full Jigsaw modularization.
Without the Module-Name metadata or some equivalent, build systems are effectively barred from helping with the conversion to achieve the very goal of this entire process.

Inconsistent behavior

Automatic modules have special behaviors that are not shared by the classpath or by modules, including allowing cycles, having access to all modules, and being unable to restrict visibility or accessibility in the way that named modules can.  Thus as a migration tool, it is problematic to rely upon them. Automatic module naming follows new patterns and relies on JAR naming conventions, with no option to customize the automatic module's name unless the JAR is renamed during assembly.

Inadequate Isolation

An expectation of a module system is that a module’s implementation choices are independent of other modules in the system. Java EE, OSGi, and other plugin systems incorporate isolation mechanisms characterized by such concepts as fully isolated package namespaces and separate module class loaders. Another example is dynamic libraries (e.g. DLLs, SOs), which support isolation of symbols.  A module system without adequate isolation will be unlikely to cope with an ecosystem which consists of modules produced by many different authors with different design parameters.

Multiple module versions

The JPMS Spec lead specifically chose not to solve multiple version resolution situations.

 

#MultipleModuleVersions — Allow multiple distinct modules of a given name to be loaded in a convenient fashion, without using reflection. This could be done by creating new layers automatically, or by relaxing the constraints on multiple versions within a layer, or by some other means (cf. #StaticLayerConfiguration, #AvoidConcealedPackageConflicts). Addressing this issue may entail reconsidering the multiple versions non-requirement. [Mike Hearn]

Resolution: These overlapping issues do reflect actual, practical problems. There are, however, already effective -- if somewhat crude -- solutions to these problems via techniques such as shading (in Maven) and shadowing (Gradle). More sophisticated solutions could be designed and implemented in a future release.

The lack of immediate solutions to these problems should not block a developer who wants to modularize an existing class-path application. If such an application works properly on the class path today then it likely does not have conflicting packages anyway, since conflicting packages on the class path almost always lead to trouble.

[Spec Lead]

 

This decision appears to be the result of the implementation choice of the Jigsaw authors to attempt to use a single class loader for all JDK modules, and then reuse that approach for application modules on the module path (a problem which is addressed elsewhere in this document). One critical specification problem is that there is no clear definition of what constitutes "multiple versions" of a module.  Jigsaw uses the following interpretation:

  • Two modules with the same package names in them are considered to be different versions of the same module.
  • Two modules with the same name are considered to be two versions of the same module.

The problem with both of these rules is that there are cases where the two modules in question are not different versions of the same module. Examples include usage of generic common names as an identifier (“util”, “beans”, “logger”, “client”, etc.), and competing distributions / variations of a standard (e.g. JSR) or common API. Therefore, the aforementioned restriction not allowing any situation that could be interpreted as multiple versions causes a serious problem for these cases.

Concealed package conflicts

When two modules have the same package name in them, but the package is private in both modules, the module system cannot load both modules into the same layer.  This situation is known as a "concealed" package conflict, because although there is no user-visible reason for a conflict, it exists nonetheless due to inadequate module isolation. This also implies that any future tool (Maven plugin, etc) that seeks to assemble a coherent set of modules for the modulepath cannot rely only on the published metadata of the modules. It must introspect within each and every module down to the package level to determine whether any conflicts exist. Handling this situation is a primary characteristic of existing module and plugin systems, including the built-in Java SE class loaders.  While Jigsaw can support a ClassLoader-per-module configuration, doing so requires a user to develop a custom bootstrap process. The standard JVM launch (using -p) will fail immediately if any two modules contain the same package, even if it is not exported.
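A hedged sketch of the situation (all module and package names are hypothetical): two modules each bundle a private copy of the same utility package without exporting it, and the default launcher still refuses to resolve them into one layer.

// module-info.java for the first library; it also contains (but does not export)
// the package com.example.internal.util
module vendor.one.client {
    exports vendor.one.api;
}

// module-info.java for the second library; it too contains an un-exported copy of
// com.example.internal.util
module vendor.two.client {
    exports vendor.two.api;
}

// Launching with both on the module path (java -p mods ...) fails during resolution
// with an error reporting that the package is present in both modules, even though
// neither module makes the package visible to anyone else.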

Non-concealed, non-conflicting duplicate package names

A similar case is where two modules have exported public packages of the same name. A scenario where this can occur is when dependencies require two ABI incompatible versions that share the same name. For example, one library might use methods in Guava 18 that were dropped in Guava 20, and another library might use methods in Guava 20 that do not exist in Guava 18. As with the concealed case, existing module systems are able to handle this use case, but it will fail on a standard JVM launch under the current Jigsaw implementation.

Split packages

"Split packages" is historically a controversial topic in Java.  This case arises when there are non-concealed and non-conflicting duplicate package names, and there is a module which consumes both of the duplicated packages.  In this case, some classes may bind to classes in one package, and some may bind to the other, or they may only bind to one or the other.This is indisputably an advanced use case, and there are many approaches to handling this at an application level.  However we believe that at a specification level, there is no technical reason to restrict this situation on a basis more strict than opt-in.

A simple solution: class loaders

Most of the issues described in this section derive from the design decision to force all modules from the module path into a single class loader, and to a lesser extent, the design decision to force platform and application modules into a single layer. The module API provides methods to construct layers which map each module into its own class loader.  However this mechanism is not used by the JDK for applications, even though it is able to solve all of the issues in this section.  The primary argument for this situation revolves around a predicted compatibility issue: applications, once converted to Jigsaw, may be surprised that getClassLoader() returns a different value with respect to the JAR file that contains a class. However, there are other more severe (and more common) compatibility breakages introduced by Jigsaw in the same situation that expose the weakness of this argument:

High-level JEP that describes CL topology restructuring

Example breakage caused in Gradle

Example breakage caused in Eclipse

Example breakage caused in CDI
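Returning to the point above that the module API already provides methods to construct layers which map each module into its own class loader, the following is a minimal sketch (assuming the final Java 9 ModuleLayer and Configuration APIs; the "mods" directory and root module name are hypothetical) of creating a child layer in which every module gets its own loader instead of the single application loader used by the default launcher:

import java.lang.module.Configuration;
import java.lang.module.ModuleFinder;
import java.nio.file.Paths;
import java.util.List;

public class ManyLoadersExample {

    public static void main(String[] args) {
        // Find modules in a local "mods" directory
        ModuleFinder finder = ModuleFinder.of(Paths.get("mods"));

        // Resolve them against the boot layer, rooted at a chosen application module
        ModuleLayer boot = ModuleLayer.boot();
        Configuration cf = boot.configuration()
                .resolve(finder, ModuleFinder.of(), List.of("com.example.app"));

        // One class loader per module, rather than one loader for the whole module path
        ModuleLayer layer = boot.defineModulesWithManyLoaders(cf, ClassLoader.getSystemClassLoader());
        layer.modules().forEach(m -> System.out.println(m.getName() + " -> " + m.getClassLoader()));
    }
}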

No "Current Module"

Existing systems rely on identifying the current application by using the Thread Context Class Loader (TCCL).  Because modules in Jigsaw are not represented by class loaders, programs which rely on this behavior of the TCCL will begin to exhibit subtly incorrect behavior. No corresponding concept exists for modules, which means that any software relying on this concept must be redesigned and re-implemented to use some different approach.

Lack of Mutability

A desired characteristic of existing modular runtimes is that modules can be dynamically installed and redefined (often referred to as hot and/or incremental deployment). Jigsaw does not support this directly (http://openjdk.java.net/projects/jigsaw/spec/issues/#MutableConfigurations), but introduces a hierarchical grouping called Layers.  We believe the hierarchical nature of this solution is a poor fit for supporting updates to modules, which other module systems commonly model as nodes with peer-to-peer relationships that form a graph.

While Layers were enhanced to support multiple parents, the solution cannot be used to model a graph (since layers cannot have cycles). Non-trivial usage of the Layers capability does not scale, as it creates very large search paths instead of the desired O(1) resolution.  Several applications will be forced to bypass and/or reimplement Jigsaw's class loading and resolution in order to get their desired behaviour.

Hierarchical layers don't meet application needs

Existing module systems have all been built on the experience that hierarchical linkage systems (such as the traditional classloader relationship) don't meet modern application deployment needs.  Hierarchical implementations introduce locking problems, visibility issues, and complex resolution for parent-first / child-first dilemmas; these and other issues have demonstrated the weakness of that approach. Hierarchical Layer relationships in Jigsaw also suffer from similar problems, e.g., a linear scan of all parents for all modules.

Modules always loaded eagerly

In order to support the restrictions imposed by Jigsaw, modules are always loaded and resolved eagerly within a layer - even if there are hundreds or thousands of modules on the module path.  This is the opposite of the classical behavior of classes on the classpath, which are always loaded, resolved, and initialized on an as-needed basis.  We believe that the as-needed loading mechanism is a useful model; existing module frameworks such as JBoss Modules and OSGi resolve lazily. In Jigsaw, platform modules must be divided into two groups: the eagerly resolved platform modules, and a set of optional modules that are only loaded if explicitly specified on the command line.  This can be awkward, particularly if a module's requirement is only discovered late in execution. In Jigsaw the JVM module path cannot have modules added to it at runtime.  This is more restrictive than the existing model with classes in a classloader, i.e., a package can always have more classes dynamically added to it, which has been repeatedly proven to be a very useful mechanism for library and framework authors.

Layer Primitives

There are proposed primitives for Jigsaw (from EG members) that add the ability to dynamically modify a module in a few specific ways; this capability is necessary and useful to developers and users of existing containers and plugin systems.

Primitives already exist, modules are already dynamic

All of the proposed primitives already exist within Jigsaw.  Modules themselves have the ability to do things like add exports, which the layer controller cannot do without injecting bytecode into the module to call these methods. Many frameworks generate proxies and other bytecode with security needs that would entail using new private packages, but the spec lead and other EG members disagree with these use cases (see the following section: A small change: a dozen lines).

Adding packages is necessary

Many frameworks, containers, tools, and libraries (including the JDK itself) make use of dynamic code generation to implement various types of functionality.  These frameworks have the same need as the JDK to generate classes in non-public packages and may not be able to function if unable to do so. Containers and plugin systems also often adhere to the practice of lazy discovery of classes.  In these cases, in order to properly interoperate with Jigsaw, such frameworks must be able to dynamically add packages and other module characteristics as they are discovered.

A small change: a dozen lines

The code to make this change is a very small patch that exposes a small number of methods already present in the implementation.  This was proposed in: (http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2016-December/000501.html and http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2016-December/000507.html) and ultimately rejected without a technical justification:

"I have too often seen APIs that seemed like a good idea at the time but were, in fact, woefully deficient, baked into the Java Platform where they fester for ages, cause pain to all who use it, and torment those who maintain it.  I will not let that happen here." - Spec Lead in JPMS posting rejecting the change

Module Naming Restrictions

Module Names

Since the initiation of Jigsaw into JPMS, module names have been restricted by the rule that they must be (or approximate) valid Java language names.  This rule excludes a large number of artifacts in existing module systems and in Maven. Many artifacts within Maven contain hyphen ("-") characters, which are not allowed by the module naming rules.  The colon (":") delimiter (used to separate artifact IDs from group IDs in Maven) is likewise disallowed. Containers have the ability to bypass these naming restrictions, but in order to do so, they must generate bytecode for their module descriptors (as the descriptor building API enforces the javac naming rules).
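A small sketch of the restriction (the names are hypothetical, mirroring common Maven artifact IDs): module names must be dot-separated Java identifiers, so descriptors written with the hyphenated names that are routine in Maven simply do not compile.

// Does NOT compile: hyphens are not legal in module names or requires clauses
module my-app {
    requires jackson-databind;
}

// The javac-acceptable equivalent forces a renaming convention onto existing artifacts
module my.app {
    requires jackson.databind;
}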

Module Version Strings

Module version strings in Jigsaw are constrained by a format which does not reflect any current versioning practices.  They are therefore incompatible with most existing Java-based versioning schemes. The implementation of version strings in Jigsaw involves several lists of Objects and extensive usage of boxed types. We believe that a module system should support versioning schemes that reflect a user's or container's best practices in common use, while also making recommendations for those cases where a practice is not established.  We believe each module loading layer should be able to establish its own policy for syntax, semantics, and ordering, operating solely within the realm of that layer and not interfering with that of other layers.

Module Descriptors Are Bytecode

The Jigsaw implementation mandates that module descriptors should be established and loaded in bytecode format.

Architectural concern

Binary descriptor formats are an uncommon implementation choice.  Text-based descriptor formats (particularly those based on common meta formats like properties, MANIFEST.MF, or XML) are easier to read, modify, and programmatically manipulate using standard tools.  The cost of parsing such files is minimal and in some cases nearly indistinguishable from their binary counterparts.

Modules should not be a part of the Java language specification

It has been suggested on multiple occasions that module descriptors do not make sense as bytecode for a variety of reasons (http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2015-December/000212.html, http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2015-September/000125.html).  However, these arguments were met with the assertion that modules are "fundamentally, a new kind of Java program component" and that it therefore has to be "specified in both the Java Language Specification (JLS) and the Java Virtual Machine Specification (JVMS)."  This assertion was subsequently contested in a post to the JPMS spec experts list in 2015. The argument that Jigsaw behaviors must be part of the JLS is used to justify the storage format and compilation behavior of Jigsaw descriptors.  However, the argument that application modularity on top of the JVM must be part of the JLS has not been strongly supported by technical arguments.  A number of successful application module, plugin, and class loading systems exist without the necessity of elevating modules to a programming language level. Even if the enhanced security and diagnostic features that the JVM provides are brought into consideration, there is no new behavior which has been shown to be required or otherwise made possible by the current implementation or JLS modification, as these are all run-time behaviors and enhancements.

Service Loading Changes

Behavioral changes

The contract of ServiceLoader was established over 10 years ago in Java 6 and is now considered a standard way to locate providers for interfaces.  Jigsaw changes the behavior of this API in substantial and compatibility-affecting ways, e.g. http://download.java.net/java/jigsaw/docs/api/java/util/ServiceLoader.html and http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2016-December/000524.html. The changes are discussed in the following sections.

No relative services

With traditional ServiceLoader, the services you load would be based on what the current class's class loader, or the specified target class loader, could locate.  This allows services to be intuitively defined on a relative basis in class loader-oriented systems. This behavior is removed for modules under Jigsaw.  A different, module-based service locator is used by default which does not have relative behavior.

Ordering is lost

In Java 6 through 8, ServiceLoader reported services in the order they are discovered by the class loader, meaning the class loader could implement a reasonable and predictable policy for returning implementations. Jigsaw does not specify the order that services are returned within a layer, which will cause unforeseen stability problems as applications may be relying on a preferred load order.

Global namespace

In Jigsaw, all service interfaces and implementations on the module path are flattened into a single, global namespace. This means that it is impossible to selectively assign service implementations, or to get a predictable result if the same interface exists in more than one location in the module graph, among other problems.

No extensibility / customizability

In Jigsaw, there is no API by which the behavior of service loading can be customized or modified back to its original behavior, unless Jigsaw modules are not used at all.  That is, the special behavior and privilege of service loaders cannot be replicated by user code.  Even when a customized layer is used, the layer provider must provide a fixed mapping of available services up front.

Strong restrictions

Every module that uses a service must also declare that the service is being used in the module descriptor.  Most existing service wiring frameworks are moving away from multiple-site declarations, as this has been found to cause issues. Failure to declare a service that is being used results in a run-time error, which can be surprising, and also prevents any sort of dynamicity in terms of finding an implementation. Java 9-aware software can dynamically add a uses declaration to its own module before loading a service, but adding this at scale will be a difficult task. These service loader changes were introduced as a balm against the rules regarding circularity; however, the changes cause new problems.  Sticking with the relative behavior would allow modules to choose their services and their implementations in the same established way that they always have, using dependency edges to create a predictable set and ordering for services. We believe that the addition of a global or layer-wide service registry (as a new, supplementary feature) would be an example of a useful (and fully compatible) change that solves the same sorts of configuration problems.
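A short sketch of the new declarations (module, package, and class names are hypothetical): the consumer must name every service type it loads with a uses clause, and providers register implementations in their descriptor rather than in META-INF/services.

// Consumer: without the uses clause, ServiceLoader.load(Codec.class) inside this module
// fails at run time with a ServiceConfigurationError
module com.example.consumer {
    requires com.example.spi;
    uses com.example.spi.Codec;
}

// Provider: the implementation is wired in the descriptor instead of a
// META-INF/services/com.example.spi.Codec file
module com.example.provider {
    requires com.example.spi;
    provides com.example.spi.Codec with com.example.provider.FastCodec;
}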

Reflection Behavioral Changes

Jigsaw introduces new restrictions on private reflection: the setAccessible() method of reflective objects may not be invoked from modules which are not specifically granted access to the module in which the corresponding member exists. However, this restriction is not consistently applied; legacy classpath-based code, as well as the unnamed module, are exempt from these restrictions.

Security justification

The security justification is clear: less reflective access means fewer CVEs. However, the security justification must be weighed against the impact of the new restrictions on compatibility and usability. Increasing security is of less use if existing or new software cannot take advantage of the new capabilities.

Compatibility impact

New module access constructs have been introduced which allow modules to opt in to inter-module reflection. However, this mechanism has compatibility concerns. Each existing artifact that is being modularized must consider the reflective accesses made against that artifact and decide what consumers must do in terms of opening access. Because it is the module being reflected upon that must grant access, and because there is no way for a module to declare that its users must open themselves for private reflective access, reflection access problems cannot be detected, or tested for, at build or load time; they only surface at run time. Examples of how users must deal with run-time errors rather than compile-time errors in both JavaFX and GSON have been posted to the JPMS comments list.
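For illustration, a minimal sketch of the opt-in (all module and package names are hypothetical): the module being reflected upon must carry the opens clause, and a missing clause surfaces only when the framework calls setAccessible() at run time.

module com.example.app {
    requires com.example.framework;
    // Without this clause, the framework's setAccessible() calls on types in
    // com.example.app.model fail at run time with InaccessibleObjectException.
    opens com.example.app.model to com.example.framework;
}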

 

     Quotes:

"I have argued that the Java security model should be brought up to date, but I understand that requires a far reaching redesign that is beyond the remit of the modularity EG.  That means that modularity should 'do no harm', while avoiding those land mines you refer to below." - Tim Ellison (IBM) in this post to the JPMS spec experts list in 2015

Specification and framework impact

It is unclear how other specifications (such as CDI or JPA) are to be granted access to the modules which consume them, especially in an embedded or containerless context (such as one might find in a cloud-style deployment).  The consuming module must somehow grant open access to the specification implementation, but the groups responsible for moving such specifications to their Java 9-ready forms are unlikely to be willing to require users to establish dependencies on modules other than the specification API module.  This implies that the specification API itself must be tailored to the implementation, or in some other general way be able to relay privileged access to implementations.

Problems with "the big kill switch"

Because of the scale of the compatibility problems, the JDK has lately added an option to blanket-disable the additional reflection security capabilities.  The change itself introduces a new problem: log messages are emitted to the error stream regardless of any application use of that stream.

     Quotes:

"The big kill switch doesn't seem useful, it just hides everything that needs work." - Keimpe Bronkhosrt (Oracle), in this post to jigsaw-dev

JSR-250 Challenges

In order to remain relevant in the modular world, a module implementing a specification will need to be consumable in a predictable manner by applications; in particular, each specification will require a predictable name and clear requirements for consumption. In the case of JSR-250 (the javax.annotation package), the Java SE platform has included the classes for quite some time. However, the classes included in the platform have lagged behind the specification in the past, and there is a general desire to move them out of the platform, both for this reason and because they do not necessarily belong in the platform. The current Jigsaw proposal seeks to do so, but has challenges.

The current proposal is to rename the bundled JSR-250 module from java.annotation to javax.ws.annotation, citing the history of JAX-WS within the platform. The user must then manually enable that module, along with any other JAX-WS support modules, to use the container-bundled JAX-WS implementation. The module will also be deprecated, encouraging use of an external version. However, this poses a challenge: how does a module distribution employ an updated version of this module at its defined name? One option is to ignore the provided module and bundle a new java.annotation module. However, this causes a problem when the built-in JAX-WS support is in use, as the packages in the new module will conflict with the module in the JDK. To get around this, a developer will need to upgrade the javax.ws.annotation module and establish a pseudo-module that aliases javax.ws.annotation to java.annotation. This is awkward for the developer. Ideally, the specification classes should be included under their specification name and made upgradeable; they could then be deprecated from the platform if necessary.

Resources and Modules

Resources are used for a variety of purposes, from data supplementation to configuration to service description. Historically, a class could use the thread context class loader or its own class loader (depending on circumstance) to locate such resources, and this model works consistently. In a multiple-module system (whether or not each module is backed by a dedicated class loader), it is useful to be able to find resources from other modules, and to know which module each found resource originated in, encapsulated in a single object which provides access to the content as well as the size and origin of the resource. This idea was proposed for the JDK in a 2009 bug report but has found little traction. Under Jigsaw there is no modular support for this kind of inter-module resource lookup, even though a number of new module-aware and classloader-incompatible resource APIs were added. The service support, which previously used the general resource support, could leverage such a mechanism for added flexibility in a modular setting; at the moment only a one-off function that relies on module implementation details is provided.

Distribution Model

An important aspect of a module system is how it manages independently developed, versioned, and packaged units of software. Two common approaches to this problem are overridable descriptors and flexible resolution systems.

Overridable Descriptor Approach

The overridable descriptor approach allows a module to redefine the module specification of its full tree of dependencies. This allows modules to publish their details on a best-effort basis, with a built-in mechanism for consumers to adapt to conflicts as necessary. The ability to adapt allows for organic evolution of the system without requiring any form of coordination between participants. Examples of this approach include Maven and JBoss Modules.

Flexible Resolution System Approach

Another approach relies on a flexible resolution system that analyzes detailed data about the modules in the system (dependencies, available packages, etc), and produces a solution accounting for the requirements of each module and the variations available (e.g. versions). This approach also has the ability to adapt to independent life-cycles. OSGi utilizes this approach.

Jigsaw - A new approach with challenges


Jigsaw, on the other hand, takes a different approach. It has neither a flexible resolution system nor adequate metadata to generate solutions for independent software composition. It also explicitly avoids any override ability, since the security benefits would otherwise be defeated (a module could override another module’s access restrictions). An override would break Jigsaw's design principle labeled “fidelity”, under which the intended flow from compile to assembly to test to distribution to run has a dependency graph that is universally consistent.
Such aims are difficult to achieve in environments other than those composed of software with a collectively coordinated life-cycle, such as the JVM itself. Carrying this outside of isolated islands of software would require a centralized and specially curated repository of some form. This was expounded upon in great detail on the JPMS experts list in 2015, but was not resolved.
 


Decentralized artifact repositories versus centralized module registry

It was originally implied that Maven would eventually evolve into a centralized repository for Jigsaw modules: 

"We're not trying to establish a new ecosystem of component distribution; we are, rather, trying to fit into existing ones, and in particular the existing Maven-based ecosystem." - Spec Lead in this post to jpms-spec-experts

 

The requirement of a single, global module namespace cannot be met by Maven Central without a fundamental and complex change to the way that submissions are curated. The reason is that today, a Maven artifact in Maven Central only has to resolve consistently relative to the set of artifacts it consumes and, to a lesser extent (because there is some flexibility here), the set of artifacts it is likely to coexist with. This flexibility and relativity goes most of the way toward mitigating the fact that many Maven artifacts have conflicting packages and version requirements. In the Jigsaw modular world, a set of artifacts must resolve in a mutually consistent way and be 100% non-conflicting in terms of module specification; they also have to be 100% mutually consistent in terms of dependency mesh. In order to have any sort of guarantee of consistency for any given module artifact, consistency must be guaranteed for all artifacts. The Maven Central model for artifacts fails in this regard for the exact same reason that there isn't, for example, one unified Linux package "mega-repository". Packaging issues aside, there are many competing implementations of the same specifications and solutions to the same problems, and these things have rippling effects on compatibility. Creating one single, unified module repository for everything in Maven Central that is internally consistent would be an extremely large undertaking for the Java ecosystem, as well as requiring major ongoing maintenance.
It was eventually acknowledged that Maven can’t meet this need (http://mail.openjdk.java.net/pipermail/jpms-spec-experts/2017-April/000667.html), yet the design and implementation constraints (described above under Distribution Model), which lead to a universal repository, still remain.

Impedance Mismatch with Maven

Since using Maven as a universal repository is not possible, supporting Jigsaw requires Maven to port its override approach in a way that works around the constraints Jigsaw imposes.
As mentioned above, Java libraries and applications are commonly composed of multiple different projects produced independently by multiple different parties. This can be readily observed by inspecting pom.xml files in the Maven Central repository. A common problem encountered in assembling software produced by different parties with different lifecycles is a transitive dependency conflict. Maven provides multiple mechanisms to resolve these conflicts, such as excluding dependencies, overriding versions, and utilizing Bills of Materials (BOMs). Additionally, the nature of the current Java classpath is such that even in the presence of a conflict (duplicate version, duplicate package, etc.), these cases may execute fine (though possibly with unexpected behaviour). This conflicts with Jigsaw’s design principles of “strong encapsulation” and “fidelity”, where the descriptors of all artifacts are non-overridable and are generated at compile time in an augmentation-unfriendly format (bytecode). In order for Maven to support the ability of a dependent to override a dependency in a complete and comprehensive manner, Maven would have to implement a post-build module-info.class augmentation facility that remains consistent with already established mechanisms and is capable of rewriting a full dependency tree. It is not clear that such a facility will be available, as it likely requires further research and new implementation work. In the meantime, Java developers will have to resolve conflicting module-info.class files themselves, editing the bytecode of and repackaging dependencies on their own as necessary.

Example Scenarios

The following examples are non-exhaustive and simplified to illustrate the types of conflicts developers would encounter.

Duplicate spec dependencies

 

  • foo-lib requires apache-JSR-XXX-api (needs jsrxxx package)
  • bar-lib requires official-JSR-XXX-api  (needs jsrxxx package)
  • app requires foo-lib and bar-lib


With Maven and the classpath, the solution to this problem is to exclude one of the jsrxxx variants. With Jigsaw, however, you will get a compile-time (and runtime) failure until you crack open foo-lib or bar-lib and edit its descriptor. Alternatively, you can create a Maven submodule artifact that builds a false alias: a module that pretends to be one of the JSR API modules and "requires transitive" the other (a sketch of this alias appears below).
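The alias workaround, sketched with hypothetical module names, amounts to an otherwise empty module that claims the name consumers expect and forwards readability to the variant actually present on the module path:

module official.jsrxxx.api {
    // Stand-in that pretends to be the expected JSR API module while
    // forwarding its readers to the other variant.
    requires transitive apache.jsrxxx.api;
}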

Dropped exported transitive

 

  • foo-lib requires transitive guava
  • bar-lib requires foo-lib (but not guava because it gets it for free with foo-lib dep, and it just works)


Some time after a release, a user asks foo-lib’s maintainers to stop exporting guava, because it is not necessary and conflicts in some other way with their use case; foo-lib agrees they got this wrong and removes the transitive keyword.
Users of bar-lib will now get IllegalAccessError, because bar-lib no longer has access to guava’s packages. To fix this, users will have to either downgrade foo-lib (if that is even possible), or crack open and edit either foo-lib or bar-lib.

Over-eager shading

 

  1. foo-lib exports an API that exposes commons-collections classes, but commons-collections doesn’t yet support Jigsaw, so foo-lib shades those classes and re-exports them (it can’t rename the packages since the types appear in the API signatures used by consumers)
  2. commons-collections later decides to publish for Jigsaw
  3. other libs use the new commons-collections module, which then conflicts with anything that also uses foo-lib

 

Qualified Open

 

  1. foo-lib opens foo.beans to bar-lib (only bar-lib has access; this works with bar-lib 1.1)
  2. myapp uses foo-lib from Maven
  3. zeta-lib uses the new bar-lib 1.2, which has new methods it needs
  4. bar-lib 1.2 was recently refactored and has moved its bean introspection code to a bar-lib-impl module
  5. myapp wants zeta-lib, but the upgrade of bar-lib breaks foo-lib
  6. myapp must now refactor foo-lib


One could argue that bar-lib’s refactoring was a mistake, and that they should have kept the reflective access point in bar-lib and delegated back to it from bar-lib-impl. bar-lib could in turn argue that they shouldn't have to structure their system around reflective access, and that foo-lib shouldn’t have qualified the open. foo-lib could argue that qualification looked like good practice.
Regardless of whether or not this is an error, and who is at fault, the software is already released, and the disruption can’t be undone until everyone upgrades.

Inadequate Compatibility Strategy

Many existing containers, frameworks, and applications are currently incompatible with Jigsaw.  The recommended compatibility strategy is to run in a legacy mode where the class path is used, and ultimately either rewrite (for Jigsaw) or abandon all the incompatible artifacts.  In order to achieve the goals of the JSR, we do not believe this strict position is necessary.

 

     Quote:

"Ok. [Forget] Jigsaw compatibility. If doing so requires use of Java 9 tools, I'll have zero [...]  level interesting in support for next 2+ years" - Tatu Saloranta (Author of Jackson, WoodStox, ClassMate, TStore) Mar 16

Unusual API Constructs

Read edges (addReads)

Read edges are a new construct in Jigsaw. They represent the consumer side of the exports/requires relationship; in effect, they are a sort of access permission. However, the permission is not granted by the module being examined; rather, it is granted by the examiner. This access-control feature does not solve any security use case, because a module can always grant itself permission to read anything. The concept is also the source of a major compatibility problem: any code using reflection requires read access in order to function correctly, so a decision was made to automatically add read access any time reflection is used on a member in a module other than the source module. This further calls the purpose of the mechanism into question; it is unclear what user-visible problem it solves.
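A minimal sketch of why read edges provide no security guarantee (the class and target type here are hypothetical): the examining module can always grant itself the edge.

public class ReadEdgeSketch {
    public static void main(String[] args) throws Exception {
        Module self = ReadEdgeSketch.class.getModule();
        // Hypothetical type living in some other named module.
        Module target = Class.forName("com.example.other.Widget").getModule();
        // Only a module may add read edges to itself, and it always can, so the
        // "permission" is granted by the examiner rather than the examined module.
        self.addReads(target);
    }
}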

Unnamed Module(s)

The unnamed module of a ClassLoader is an architectural artifact that results from the mismatch between mapping classes to modules and mapping them to class loaders. Essentially, any class that isn't explicitly loaded into a module is placed in the unnamed module of its class loader (note that this implies that there are many unnamed modules). Unnamed modules have unique behavior compared to named modules: they cannot opt in to reflection restrictions, and they report no name or version in stack traces.
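A small sketch of that behavior (the class name is arbitrary): when launched from the class path rather than the module path, a class lands in its loader's unnamed module, which reports no name.

public class UnnamedModuleCheck {
    public static void main(String[] args) {
        Module m = UnnamedModuleCheck.class.getModule();
        System.out.println(m.isNamed());   // false when run from the class path
        System.out.println(m.getName());   // null for an unnamed module
    }
}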

Conflation of paths as packages

The design of Jigsaw fully isolates and hides resources between modules. This was done so that all linkage decisions between modules could be made solely on a package basis. However, once inter-module resource access was introduced, this conflation became awkward, resulting in rules such as: "[...] The effective package name of a resource named by the string `"/foo/bar/baz"`, e.g., is 'foo.bar' [...] If a resource's effective package name is not a valid Java language package name (e.g., "META-INF.foo.bar") then the resource can be located by code in any module." We believe that modules should be able to control resource access on a similar basis to controlling package access, regardless of the path in which a resource is found. Representing a path name as an invalid package name is an awkward construct.

Module descriptors cannot be easily constructed

The only way to define new modules in software is by creating a module-info descriptor. There is a programmatic API for doing so which (by design) only allows a subset of possible module descriptor information to be specified. This is a deliberate choice, so that users do not exploit capabilities that deviate from the narrow set of approved use cases. In order to create descriptors which utilize the full range of Jigsaw capabilities, bytecode must be generated and fed into the binary descriptor parser; generally speaking, special support libraries are required to do this. This design deliberately creates difficulties for dynamic programs, frameworks, and containers.
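As a rough sketch of that restricted programmatic surface (the module and package names are hypothetical), the builder covers only the descriptor information it chooses to expose; per the point above, anything beyond it must be produced as module-info.class bytecode instead.

import java.lang.module.ModuleDescriptor;

public class DescriptorSketch {
    public static void main(String[] args) {
        // Builder-based descriptor for a hypothetical plugin module; the
        // capabilities available are limited to what the builder API offers.
        ModuleDescriptor descriptor = ModuleDescriptor.newModule("com.example.plugin")
                .requires("java.base")
                .exports("com.example.plugin.api")
                .build();
        System.out.println(descriptor);
    }
}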

Secondary API for loading classes and resources

Since the beginning of Java, locating resources and class content has been possible in two ways: by class and by class loader.  In order to transition to Jigsaw, classes which load resources from specific peers must be rewritten to use the methods on Module instead. This means that they will require distinct implementations for Java 8 versus Java 9 in order to achieve the same behavior.
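A rough sketch of that split (the resource path and class are illustrative only): the class-loader lookup that works through Java 8 must be supplemented or replaced by the Module-based lookup for resources encapsulated in other named modules under Java 9.

import java.io.IOException;
import java.io.InputStream;

public class ResourceLookup {
    public InputStream loadDefaults() throws IOException {
        // Java 8 style: resolved through the class loader's view.
        InputStream viaLoader = ResourceLookup.class.getClassLoader()
                .getResourceAsStream("config/defaults.properties");
        if (viaLoader != null) {
            return viaLoader;
        }
        // Java 9 style: ask the owning module directly, subject to its
        // resource-encapsulation rules.
        return ResourceLookup.class.getModule()
                .getResourceAsStream("config/defaults.properties");
    }
}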

"Optional" is everywhere

Most of the API changes in Jigsaw use java.util.Optional for getter return values as a substitute for null-checking. Opinions of this feature vary widely, and general usage of it in this sort of context remains controversial. Several changes to the class, intended to address outstanding issues, have been introduced in Java 9 and are not yet considered battle-tested best practice.
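For a concrete illustration of the pattern in question, module descriptors expose their version as an Optional rather than as a possibly-null value (this example only reads the descriptor of java.base):

import java.lang.module.ModuleDescriptor;
import java.util.Optional;

public class OptionalGetters {
    public static void main(String[] args) {
        // java.base is always a named module, so a descriptor is present.
        ModuleDescriptor base = Object.class.getModule().getDescriptor();
        Optional<ModuleDescriptor.Version> version = base.version();
        System.out.println(version.map(Object::toString).orElse("<no version>"));
    }
}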

 

     Quotes:

"Of course, people will do what they want. But we did have a clear intention when adding this feature, and it was not to be a general purpose Maybe or Some type, as much as many people would have liked us to do so. Our intention was to provide a limited mechanism for library method return types where there needed to be a clear way to represent "no result", and using null for such was overwhelmingly likely to cause errors.
    

"For example, you probably should never use it for something that returns an array of results, or a list of results; instead return an empty array or list. You should almost never use it as a field of something or a method parameter.
    

"I think routinely using it as a return value for getters would definitely be over-use." - Brian Goetz in this post to StackOverflow

 

     References:

Java 8 Optional: What's the Point?

Why java.util.Optional is broken

How Optional Breaks the Monad Laws and Why It Matters

What's Wrong in Java 8, Part IV: Monads

Primary Use Case Considerations and Strategies

Java Developer Strategies for Using Jigsaw in its Current Form

  1. Avoid using Jigsaw, and instead advise Jigsaw users who wish to consume your project to create their own local module to represent it.
  2. If you wish to support Jigsaw, and you wish your project to be usable by non-Jigsaw users (Class-Path, Java 8, Java EE, OSGi, Eclipse, etc.), then you can produce a special Jigsaw-only build along with a traditional build of your library.
  3. Avoid relying exclusively on the package exclusion security capabilities of Jigsaw, as they can be disabled via the command line, and the traditional build mentioned in step 2 won’t utilize them.
  4. Reduce the number of Jigsaw dependencies in your project to the smallest possible number, since it’s unclear when or if the dependency and package conflict issues could be worked around by build systems such as Maven. Consider other strategies such as:
    1. Define your own modules locally to represent a dependency;
    2. Use the shade plugin to relocate packages and merge the dependency into your module (this avoids the duplicate concealed package issue, as well as version conflicts);
    3. Directly include the source of the dependency in your project.
  5. Be prepared to patch and convert dependencies which encounter compatibility issues with Jigsaw.
  6. Since there is no multi-module packaging system in Jigsaw, consider just defining everything in one module to simplify distribution.

Standalone SE Application Strategies for Using Jigsaw

  1. Avoid using Jigsaw, due to the issues presented in this document.
  2. If your application needs to run on Java 8 or earlier, your options will include (note that all options will require multiple launch mechanisms):
    1. Compile your source code as Java 8, but your module-info.java as Java 9 for every module shipped (note that this will require either multiple javac build invocations, or generating your own bytecode for the module-info.class as an additional step);
    2. Utilize two build pipeline stages that produce two separate target distributions (one Jigsaw Java 9 distribution and one Java 8-or-earlier distribution);
    3. Utilize two build stages as above and then construct a build script to merge the output for each jar to produce a single, multi-version JAR.
  3. Avoid relying exclusively on the package exclusion security capabilities of Jigsaw, as they can be disabled via the command line, and the traditional build mentioned in step 2 won’t utilize them.
  4. Reduce the number of Jigsaw dependencies in your project to the smallest possible, since it’s unclear when or if the dependency and package conflict issues could be worked around by build systems such as Maven. Consider other strategies such as:
    1. Define your own modules locally to represent a dependency;
    2. Use the shade plugin to relocate packages and merge the dependency into your module (this avoids the duplicate concealed package issue, as well as version conflicts);
    3. Directly include the source of the dependency in your project;
    4. Define your own layer with a custom class-loading facility instead of the standard jigsaw launch to manage conflicts.
  5. Be prepared to patch and convert dependencies which encounter compatibility issues with Jigsaw and/or service loader namespace conflicts.
  6. Since there is no multi-module packaging system in Jigsaw, consider just defining everything in one module to simplify distribution.

 

Dynamic Runtime/Container Strategies for Supporting Jigsaw

Note: this advice applies to any existing or greenfield dynamic runtime environment.

  1. If possible, discourage users from using Jigsaw, and encourage them to produce traditional packaging or utilize other modular technologies.  Due to several issues, many of which are presented in the document, this will likely help users of your runtime produce a more reliable system.
  2. Due to the limitations expressed in this document (particularly around mutability and hierarchical layers), if support of Jigsaw is desired, it will likely require a complete reimplementation of the Jigsaw contracts utilizing a Jigsaw facade in front of a custom modular class-loading implementation.
  3. Due to restrictions in the APIs, in order to adequately support reflective dynamic frameworks, such as dependency injection at runtime, containers will likely need to modify the bytecode of the module-info.class provided by the user to add appropriate qualified opens declarations.
  4. If support of lazy loading of packages and/or dynamic package extension is required, a runtime will need to take extreme measures, such as altering the Jigsaw implementation with an agent and/or utilizing Unsafe.
  5. In order to support custom serialization frameworks (e.g. Xstream), a runtime will need to bypass the package restriction facility in Jigsaw using Unsafe.
  6. Since plugin/deployment code built using Jigsaw might have a security model based on Jigsaw package restrictions, a container should try to wall off access as much as possible using any isolation mechanisms available based on the selected class loading strategy.
  7. Due to possible conflicts with module names, dependencies, and service names, a runtime should consider rewriting/redefining module-info.class.
  8. Due to all of the above, runtimes should advise their users that module metadata returned from Java reflection will not match expectations, nor what is observed in a standalone Jigsaw execution.

 

So we are looking into some Jigsaw module issues, and I needed to set up a build of OpenJDK on my development box, which happens to be running OSX 10.11.6. First, to get the code and configure the project build I used:

 

 

  1. hg clone http://hg.openjdk.java.net/jdk9/jdk9 jdk9
  2. cd jdk9/
  3. bash ./get_source.sh
  4. bash ./configure --with-boot-jdk=/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home
  5. make all

 

Note that in step 4 I had to pass in the --with-boot-jdk option because I have a JDK 9 early access build installed under my /Library/Java/JavaVirtualMachines/ directory, and apparently that cannot be used as the bootstrap JDK. When I tried it, the compiler crashed with an error about not understanding the MODULE annotation element or some such.

 

At this point I was able to start compiling, but I ran into two errors. The first was this error in the hotspot tree:

 

/Volumes/ScottBackup/Java9/jdk9/hotspot/src/jdk.hotspot.agent/macosx/native/libsaproc/MacosxDebuggerLocal.m:691:21: error: 'ePtAttachDeprecated' is deprecated: PT_ATTACH is deprecated. See PT_ATTACHEXC [-Werror,-Wdeprecated-declarations]

  if ((res = ptrace(PT_ATTACH, pid, 0, 0)) < 0) {

                    ^

/usr/include/sys/ptrace.h:85:19: note: expanded from macro 'PT_ATTACH'

#define PT_ATTACH       ePtAttachDeprecated     /* trace some running process */

                        ^

/usr/include/sys/ptrace.h:71:2: note: 'ePtAttachDeprecated' has been explicitly marked deprecated here

        ePtAttachDeprecated __deprecated_enum_msg("PT_ATTACH is deprecated. See PT_ATTACHEXC") = 10

 

It is a simple matter of changing PT_ATTACH to PT_ATTACHEXC as suggested by the warning. The attached hotspot.patch has the full diff.

 

The next errors were in the jdk tree during the build of the libjavajpeg library. Here is the set of errors seen in the jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c file:

 

 

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:458:13: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

  { 0, ((-1)<<1) + 1, ((-1)<<2) + 1, ((-1)<<3) + 1, ((-1)<<4) + 1,

        ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:458:28: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

  { 0, ((-1)<<1) + 1, ((-1)<<2) + 1, ((-1)<<3) + 1, ((-1)<<4) + 1,

                       ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:458:43: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

  { 0, ((-1)<<1) + 1, ((-1)<<2) + 1, ((-1)<<3) + 1, ((-1)<<4) + 1,

                                      ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:458:58: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

  { 0, ((-1)<<1) + 1, ((-1)<<2) + 1, ((-1)<<3) + 1, ((-1)<<4) + 1,

                                                     ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:459:10: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<5) + 1, ((-1)<<6) + 1, ((-1)<<7) + 1, ((-1)<<8) + 1,

     ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:459:25: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<5) + 1, ((-1)<<6) + 1, ((-1)<<7) + 1, ((-1)<<8) + 1,

                    ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:459:40: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<5) + 1, ((-1)<<6) + 1, ((-1)<<7) + 1, ((-1)<<8) + 1,

                                   ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:459:55: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<5) + 1, ((-1)<<6) + 1, ((-1)<<7) + 1, ((-1)<<8) + 1,

                                                  ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:460:10: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<9) + 1, ((-1)<<10) + 1, ((-1)<<11) + 1, ((-1)<<12) + 1,

     ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:460:25: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<9) + 1, ((-1)<<10) + 1, ((-1)<<11) + 1, ((-1)<<12) + 1,

                    ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:460:41: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<9) + 1, ((-1)<<10) + 1, ((-1)<<11) + 1, ((-1)<<12) + 1,

                                    ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:460:57: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<9) + 1, ((-1)<<10) + 1, ((-1)<<11) + 1, ((-1)<<12) + 1,

                                                    ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:461:10: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<13) + 1, ((-1)<<14) + 1, ((-1)<<15) + 1 };

     ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:461:26: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<13) + 1, ((-1)<<14) + 1, ((-1)<<15) + 1 };

                     ~~~~^

/Volumes/ScottBackup/Java9/jdk9/jdk/src/java.desktop/share/native/libjavajpeg/jdhuff.c:461:42: error: shifting a negative signed value is undefined [-Werror,-Wshift-negative-value]

    ((-1)<<13) + 1, ((-1)<<14) + 1, ((-1)<<15) + 1 };

                                     ~~~~^

15 errors generated.

 

A similar set of errors is seen in the jdk/src/java.desktop/share/native/libjavajpeg/jdphuff.c file due to the same code being used there. The fix can be found in the attached jdk.patch file. So, add the following steps to apply the patches:

 

  1. save the hotspot.patch and jdk.patch files into the jdk9 directory
  2. cd hotspot
  3. patch -p1 <../hotspot.patch
  4. cd ../jdk
  5. patch -p1 <../jdk.patch

 

At this point you should be able to run the make all step from within the jdk9 directory. On my iMac the build looked like:

 

 

[jdk9 546]$ time make all

Building target 'all' in configuration 'macosx-x86_64-normal-server-release'

Building JVM variant 'server' with features 'all-gcs cds compiler1 compiler2 dtrace fprof jni-check jvmci jvmti management nmt services vm-structs'

Compiling 8 files for BUILD_TOOLS_LANGTOOLS

Creating libjsig.dylib from 1 file(s)

Creating adlc from 13 file(s)

Compiling 2 files for BUILD_JVMTI_TOOLS

Parsing 1 properties into enum-like class for jdk.compiler

Compiling 16 properties into resource bundles for jdk.compiler

Compiling 19 properties into resource bundles for jdk.javadoc

Compiling 10 properties into resource bundles for jdk.jdeps

Compiling 7 properties into resource bundles for jdk.jshell

Compiling 115 files for BUILD_INTERIM_java.compiler

Compiling 390 files for BUILD_INTERIM_jdk.compiler

Creating libjvm.dylib from 695 file(s)

Creating libjvm.dylib from 19 file(s)

Creating gtestLauncher from 1 file(s)

Compiling 61 files for BUILD_INTERIM_jdk.jdeps

Compiling 450 files for BUILD_INTERIM_jdk.javadoc

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 155 files for BUILD_TOOLS_JDK

Compiling 141 files for BUILD_IDLJ

Compiling 6 files for BUILD_TOOLS_CORBA

Compiling 198 files for BUILD_INTERIM_RMIC

Note: /Volumes/ScottBackup/Java9/jdk9/corba/src/java.corba/share/classes/com/sun/tools/corba/se/idl/som/idlemit/MetaPragma.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 31 files for BUILD_JRTFS

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Creating support/jrt-fs.jar

Creating libsanity_SimpleNativeLib.dylib from 1 file(s)

Creating libsanity_SimpleNativeLib2.dylib from 1 file(s)

Creating sanity_SimpleNativeLauncher from 1 file(s)

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

Compiling 2 files for BUILD_BREAKITERATOR_BASE

Compiling 259 files for BUILD_BREAKITERATOR_LD

Compiling 11 properties into resource bundles for java.logging

Compiling 11 properties into resource bundles for java.management

Compiling 11 properties into resource bundles for java.base

Compiling 6 properties into resource bundles for java.base

Compiling 11 properties into resource bundles for jdk.jartool

Compiling 3 properties into resource bundles for jdk.jdi

Compiling 4 properties into resource bundles for jdk.jlink

Compiling 3 properties into resource bundles for jdk.jlink

Compiling 1 properties into resource bundles for jdk.jlink

Compiling 225 properties into resource bundles for jdk.localedata

Compiling 2 files for COMPILE_CREATE_SYMBOLS

Creating ct.sym classes

Creating support/symbols/ct.sym

Compiling 2853 files for java.base

Compiling 101 properties into resource bundles for java.desktop

Compiling 17 files for java.datatransfer

Compiling 34 files for java.logging

Compiling 6 files for java.annotations.common

Compiling 15 files for java.scripting

Compiling 116 files for java.compiler

Compiling 123 files for java.rmi

Compiling 1816 files for java.xml

Compiling 8 files for java.instrument

Compiling 30 files for java.security.sasl

Compiling 4 files for java.transaction

Compiling 110 files for java.httpclient

Compiling 44 files for jdk.httpserver

Compiling 21 files for java.smartcardio

Compiling 59 files for jdk.jvmstat

Compiling 145 files for jdk.charsets

Compiling 392 files for jdk.compiler

Compiling 8 files for jdk.crypto.ec

Compiling 66 files for jdk.dynalink

Compiling 46 files for jdk.internal.le

Compiling 46 files for jdk.internal.opt

Compiling 31 files for jdk.jartool

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1 files for jdk.jdwp.agent

Compiling 11 files for jdk.jstatd

Compiling 6 files for jdk.net

Compiling 2102 files for jdk.localedata

Compiling 1 files for jdk.pack200

Compiling 117 files for BUILD_NASGEN

Compiling 25 files for jdk.sctp

Compiling 10 files for jdk.unsupported

Compiling 595 files for jdk.scripting.nashorn

Compiling 177 files for jdk.vm.ci

Compiling 90 files for jdk.xml.dom

Compiling 14 files for jdk.zipfs

Running nasgen

Warning: generation and use of skeletons and static stubs for JRMP

is deprecated. Skeletons are unnecessary, and static stubs have

been superseded by dynamically generated stubs. Users are

encouraged to migrate away from using rmic to generate skeletons and static

stubs. See the documentation for java.rmi.server.UnicastRemoteObject.

Compiling 17 files for java.prefs

Warning: generation and use of skeletons and static stubs for JRMP

is deprecated. Skeletons are unnecessary, and static stubs have

been superseded by dynamically generated stubs. Users are

encouraged to migrate away from using rmic to generate skeletons and static

stubs. See the documentation for java.rmi.server.UnicastRemoteObject.

Compiling 1 files for java.compact1

Compiling 78 files for java.sql

Compiling 193 files for java.naming

Compiling 283 files for java.xml.crypto

Compiling 15 files for jdk.attach

Compiling 67 files for jdk.crypto.pkcs11

Compiling 38 files for jdk.jcmd

Compiling 451 files for jdk.javadoc

Compiling 124 files for jdk.jdeps

Compiling 250 files for jdk.jdi

Compiling 15 files for jdk.naming.dns

Compiling 7 files for jdk.naming.rmi

Compiling 1 files for java.compact2

Compiling 374 files for java.management

Compiling 211 files for java.security.jgss

Compiling 51 files for java.sql.rowset

Compiling 71 files for jdk.jlink

Compiling 2781 files for java.desktop

Compiling 14 files for jdk.security.jgss

Compiling 37 files for jdk.security.auth

Warning: generation and use of skeletons and static stubs for JRMP

is deprecated. Skeletons are unnecessary, and static stubs have

been superseded by dynamically generated stubs. Users are

encouraged to migrate away from using rmic to generate skeletons and static

stubs. See the documentation for java.rmi.server.UnicastRemoteObject.

Warning: generation and use of skeletons and static stubs for JRMP

is deprecated. Skeletons are unnecessary, and static stubs have

been superseded by dynamically generated stubs. Users are

encouraged to migrate away from using rmic to generate skeletons and static

stubs. See the documentation for java.rmi.server.UnicastRemoteObject.

Compiling 1 files for java.compact3

Compiling 24 files for jdk.management

Updating support/src.zip

WARNING: Generated file does not exist: /Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/hotspot/dist/docs/platform/jvmti/jvmti.html

# Running javadoc for images/docs/api/index.html

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 25 files for java.activation

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1480 files for java.corba

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1 files for java.se

Compiling 747 files for java.xml.bind

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 18 files for jdk.accessibility

Compiling 984 files for jdk.hotspot.agent

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 64 files for jdk.jconsole

Compiling 84 files for jdk.jshell

Compiling 5 files for jdk.jsobject

Compiling 14 files for jdk.policytool

Compiling 227 files for jdk.rmic

Compiling 10 files for jdk.scripting.nashorn.shell

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 776 files for jdk.xml.bind

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1227 files for java.xml.ws

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1 files for java.se.ee

Compiling 235 files for jdk.xml.ws

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1 files for BUILD_DEMO_APPLET_ArcTest

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/ArcTest/ArcTest.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_BarChart

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/BarChart/BarChart.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_Blink

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/Blink/Blink.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_CardTest

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/CardTest/CardTest.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_Clock

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/Clock/Clock.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_DitherTest

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/DitherTest/DitherTest.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_DrawTest

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/DrawTest/DrawTest.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_Fractal

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/Fractal/CLSFractal.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 2 files for BUILD_DEMO_APPLET_GraphicsTest

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_NervousText

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/NervousText/NervousText.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_SimpleGraph

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/SimpleGraph/GraphApplet.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 5 files for BUILD_DEMO_APPLET_SortDemo

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/SortDemo/SortItem.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_APPLET_SpreadSheet

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/SpreadSheet/SpreadSheet.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 3 files for BUILD_DEMO_CodePointIM

Updating support/demos/image/jfc/CodePointIM/src.zip

Compiling 2 files for BUILD_DEMO_MoleculeViewer

Updating support/demos/image/applets/MoleculeViewer/src.zip

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/MoleculeViewer/XYZApp.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 2 files for BUILD_DEMO_WireFrame

Updating support/demos/image/applets/WireFrame/src.zip

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/applets/WireFrame/ThreeD.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 1 files for BUILD_DEMO_SwingApplet

Updating support/demos/image/jfc/SwingApplet/src.zip

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/jfc/SwingApplet/SwingApplet.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Compiling 3 files for BUILD_DEMO_FileChooserDemo

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/jfc/FileChooserDemo/FileChooserDemo.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/jfc/FileChooserDemo/FileChooserDemo.java uses unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Updating support/demos/image/jfc/FileChooserDemo/src.zip

Compiling 4 files for BUILD_DEMO_Font2DTest

Updating support/demos/image/jfc/Font2DTest/src.zip

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: Some input files use unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 15 files for BUILD_DEMO_Metalworks

Updating support/demos/image/jfc/Metalworks/src.zip

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/jfc/Metalworks/MetalworksPrefs.java uses unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 2 files for BUILD_DEMO_Notepad

Updating support/demos/image/jfc/Notepad/src.zip

Compiling 5 files for BUILD_DEMO_SampleTree

Updating support/demos/image/jfc/SampleTree/src.zip

Compiling 8 files for BUILD_DEMO_TableExample

Updating support/demos/image/jfc/TableExample/src.zip

Note: Some input files use or override a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/jfc/TableExample/TableExample4.java uses unchecked or unsafe operations.

Note: Recompile with -Xlint:unchecked for details.

Compiling 1 files for BUILD_DEMO_TransparentRuler

Updating support/demos/image/jfc/TransparentRuler/src.zip

Compiling 3 files for BUILD_DEMO_jconsole-plugin

Updating support/demos/image/scripting/jconsole-plugin/src.zip

Compiling 3 files for BUILD_DEMO_FullThreadDump

Updating support/demos/image/management/FullThreadDump/src.zip

Compiling 2 files for BUILD_DEMO_JTop

Note: /Volumes/ScottBackup/Java9/jdk9/jdk/src/demo/share/management/JTop/JTop.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

Updating support/demos/image/management/JTop/src.zip

Compiling 1 files for BUILD_DEMO_MemoryMonitor

Updating support/demos/image/management/MemoryMonitor/src.zip

Compiling 2 files for BUILD_DEMO_VerboseGC

Updating support/demos/image/management/VerboseGC/src.zip

Creating libcompiledMethodLoad.dylib from 2 file(s)

Updating support/demos/image/jvmti/compiledMethodLoad/src.zip

Creating libgctest.dylib from 2 file(s)

Updating support/demos/image/jvmti/gctest/src.zip

Creating libheapViewer.dylib from 2 file(s)

Updating support/demos/image/jvmti/heapViewer/src.zip

Creating libversionCheck.dylib from 2 file(s)

Updating support/demos/image/jvmti/versionCheck/src.zip

Creating libheapTracker.dylib from 3 file(s)

Compiling 1 files for BUILD_DEMO_JVMTI_JAVA_heapTracker

Updating support/demos/image/jvmti/heapTracker/src.zip

Creating libminst.dylib from 3 file(s)

Compiling 1 files for BUILD_DEMO_JVMTI_JAVA_minst

Updating support/demos/image/jvmti/minst/src.zip

Creating libmtrace.dylib from 3 file(s)

Compiling 1 files for BUILD_DEMO_JVMTI_JAVA_mtrace

Updating support/demos/image/jvmti/mtrace/src.zip

Creating libwaiters.dylib from 5 file(s)

Updating support/demos/image/jvmti/waiters/src.zip

Creating support/demos/image/jfc/CodePointIM/CodePointIM.jar

Creating support/demos/image/applets/MoleculeViewer/MoleculeViewer.jar

Creating support/demos/image/applets/WireFrame/WireFrame.jar

Creating support/demos/image/jfc/SwingApplet/SwingApplet.jar

Creating support/demos/image/jfc/FileChooserDemo/FileChooserDemo.jar

Creating support/demos/image/jfc/Font2DTest/Font2DTest.jar

Creating support/demos/image/jfc/Metalworks/Metalworks.jar

Creating support/demos/image/jfc/Notepad/Notepad.jar

Creating support/demos/image/jfc/SampleTree/SampleTree.jar

Creating support/demos/image/jfc/TableExample/TableExample.jar

Creating support/demos/image/jfc/TransparentRuler/TransparentRuler.jar

Creating support/demos/image/scripting/jconsole-plugin/jconsole-plugin.jar

Creating support/demos/image/management/FullThreadDump/FullThreadDump.jar

Creating support/demos/image/management/JTop/JTop.jar

Creating support/demos/image/management/MemoryMonitor/MemoryMonitor.jar

Creating support/demos/image/management/VerboseGC/VerboseGC.jar

Creating support/demos/image/jvmti/heapTracker/heapTracker.jar

Creating support/demos/image/jvmti/minst/minst.jar

Creating support/demos/image/jvmti/mtrace/mtrace.jar

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

ld: warning: directory not found for option '-L/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/modules_libs/java.base/server'

# Running javadoc for images/docs/jdk/api/javadoc/doclet/index.html

# Running javadoc for images/docs/jdk/api/javadoc/old/doclet/index.html

# Running javadoc for images/docs/jdk/api/javadoc/old/taglet/index.html

# Running javadoc for images/docs/jre/api/plugin/dom/index.html

# Running javadoc for images/docs/jdk/api/jpda/jdi/index.html

# Running javadoc for images/docs/jre/api/security/jaas/spec/index.html

# Running javadoc for images/docs/jre/api/security/jgss/spec/index.html

# Running javadoc for images/docs/jre/api/security/smartcardio/spec/index.html

# Running javadoc for images/docs/jre/api/net/httpserver/spec/index.html

# Running javadoc for images/docs/jre/api/plugin/jsobject/index.html

# Running javadoc for images/docs/jre/api/management/extension/index.html

# Running javadoc for images/docs/jdk/api/attach/spec/index.html

# Running javadoc for images/docs/jdk/api/jconsole/spec/index.html

# Running javadoc for images/docs/jdk/api/jshell/index.html

# Running javadoc for images/docs/jdk/api/javac/tree/index.html

# Running javadoc for images/docs/jdk/api/nashorn/index.html

# Running javadoc for images/docs/jdk/api/dynalink/index.html

# Running javadoc for images/docs/jre/api/nio/sctp/spec/index.html

# Running javadoc for images/docs/jre/api/accessibility/jaccess/spec/index.html

# Running javadoc for images/docs/jre/api/net/socketoptions/spec/index.html

# Running javadoc for images/docs/jdk/api/jlink/index.html

Creating libjava.dylib from 61 file(s)

Creating libverify.dylib from 2 file(s)

Creating libfdlibm.dylib from 57 file(s)

Creating libzip.dylib from 5 file(s)

Creating libjimage.dylib from 6 file(s)

Creating libjli.dylib from 8 file(s)

Creating libjli_static.dylib from 8 file(s)

Creating libnet.dylib from 20 file(s)

Creating libosxsecurity.dylib from 1 file(s)

Creating libnio.dylib from 25 file(s)

Creating libJniVersion.dylib from 1 file(s)

Creating libUninitializedStrings.dylib from 1 file(s)

Creating libDefaultMethods.dylib from 1 file(s)

Creating libToStringTest.dylib from 1 file(s)

Creating libGetModule.dylib from 1 file(s)

Creating libSameObject.dylib from 1 file(s)

Creating libNativeSmallIntCalls.dylib from 1 file(s)

Creating libTest15FloatJNIArgs.dylib from 1 file(s)

Creating libCallsNative.dylib from 1 file(s)

Creating libTestDirtyInt.dylib from 1 file(s)

Creating libGetNamedModuleTest.dylib from 1 file(s)

Creating libSimpleClassFileLoadHook.dylib from 1 file(s)

Creating libNativeCallTest.dylib from 1 file(s)

Creating libJvmtiGetAllModulesTest.dylib from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base//libfdlibm.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating appletviewer from 1 file(s)

Creating jimage from 1 file(s)

Creating jlink from 1 file(s)

Creating java from 1 file(s)

Creating jmod from 1 file(s)

Creating keytool from 1 file(s)

Creating jspawnhelper from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libinstrument.dylib from 12 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libosx.dylib from 1 file(s)

Creating libosxapp.dylib from 5 file(s)

Creating libmlib_image.dylib from 50 file(s)

Creating libawt.dylib from 71 file(s)

Creating liblcms.dylib from 27 file(s)

Creating libjavajpeg.dylib from 46 file(s)

Creating libfontmanager.dylib from 129 file(s)

Creating libjawt.dylib from 1 file(s)

Creating libawt_lwawt.dylib from 72 file(s)

Creating libsplashscreen.dylib from 70 file(s)

Creating libosxui.dylib from 7 file(s)

Creating libjsound.dylib from 17 file(s)

Creating libmanagement.dylib from 10 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libprefs.dylib from 1 file(s)

Creating librmi.dylib from 1 file(s)

Creating rmid from 1 file(s)

Creating rmiregistry from 1 file(s)

Creating jrunscript from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/java.se.jmod

Creating libj2gss.dylib from 3 file(s)

Creating libosxkrb5.dylib from 2 file(s)

Creating images/jmods/java.security.sasl.jmod

Creating libj2pcsc.dylib from 2 file(s)

Creating images/jmods/java.sql.jmod

Creating images/jmods/java.sql.rowset.jmod

Creating images/jmods/java.xml.jmod

Creating images/jmods/java.xml.crypto.jmod

Creating libattach.dylib from 1 file(s)

Creating images/jmods/jdk.charsets.jmod

Creating javac from 1 file(s)

Creating javah from 1 file(s)

Creating serialver from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libsunec.dylib from 28 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libj2pkcs11.dylib from 14 file(s)

Creating images/jmods/jdk.dynalink.jmod

Creating libsaproc.dylib from 5 file(s)

Creating jhsdb from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/jdk.httpserver.jmod

Creating images/jmods/jdk.internal.opt.jmod

Creating jar from 1 file(s)

Creating jarsigner from 1 file(s)

Creating javadoc from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating jinfo from 1 file(s)

Creating jmap from 1 file(s)

Creating jps from 1 file(s)

Creating jstack from 1 file(s)

Creating jstat from 1 file(s)

Creating jcmd from 1 file(s)

Creating jconsole from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating javap from 1 file(s)

Creating jdeps from 1 file(s)

Creating jdeprscan from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating jdb from 1 file(s)

Creating libdt_socket.dylib from 2 file(s)

Creating libjdwp.dylib from 42 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating jshell from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/jdk.jsobject.jmod

Creating jstatd from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/jdk.jvmstat.jmod

Creating images/jmods/jdk.localedata.jmod

Creating libmanagement_ext.dylib from 8 file(s)

Creating images/jmods/jdk.naming.dns.jmod

Creating images/jmods/jdk.naming.rmi.jmod

Creating libunpack.dylib from 7 file(s)

Creating pack200 from 1 file(s)

Creating unpack200 from 7 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating policytool from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/jdk.scripting.nashorn.jmod

Creating jjs from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating libjaas_unix.dylib from 1 file(s)

Creating images/jmods/jdk.security.jgss.jmod

Creating images/jmods/jdk.unsupported.jmod

Creating images/jmods/jdk.vm.ci.jmod

Creating images/jmods/jdk.xml.dom.jmod

Creating images/jmods/jdk.zipfs.jmod

Creating idlj from 1 file(s)

Creating orbd from 1 file(s)

Creating servertool from 1 file(s)

Creating tnameserv from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating images/jmods/java.se.ee.jmod

Creating images/jmods/java.transaction.jmod

Creating images/jmods/java.xml.bind.jmod

Creating images/jmods/java.xml.ws.jmod

Creating rmic from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating schemagen from 1 file(s)

Creating xjc from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Creating wsgen from 1 file(s)

Creating wsimport from 1 file(s)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

ld: warning: object file (/Volumes/ScottBackup/Java9/jdk9/build/macosx-x86_64-normal-server-release/support/native/java.base/libjli_static.a) was built for newer OSX version (10.11) than being linked (10.7)

Updating images/sec-bin.zip

Creating images/jmods/java.activation.jmod

Creating images/jmods/java.annotations.common.jmod

Creating images/jmods/java.compact1.jmod

Creating images/jmods/java.compact2.jmod

Creating images/jmods/java.compact3.jmod

Creating images/jmods/java.compiler.jmod

Creating images/jmods/java.datatransfer.jmod

Creating images/jmods/java.httpclient.jmod

Creating images/jmods/java.logging.jmod

Creating images/jmods/java.naming.jmod

Creating images/jmods/jdk.jlink.jmod

Creating images/jmods/java.instrument.jmod

Creating images/jmods/java.management.jmod

Creating images/jmods/java.prefs.jmod

Creating images/jmods/java.rmi.jmod

Creating images/jmods/java.scripting.jmod

Creating images/jmods/java.security.jgss.jmod

Creating images/jmods/java.smartcardio.jmod

Creating images/jmods/jdk.attach.jmod

Creating images/jmods/jdk.compiler.jmod

Creating images/jmods/jdk.crypto.ec.jmod

Creating images/jmods/jdk.crypto.pkcs11.jmod

Creating images/jmods/jdk.hotspot.agent.jmod

Creating images/jmods/jdk.internal.le.jmod

Creating images/jmods/jdk.jartool.jmod

Creating images/jmods/jdk.javadoc.jmod

Creating images/jmods/jdk.jcmd.jmod

Creating images/jmods/jdk.jconsole.jmod

Creating images/jmods/jdk.jdeps.jmod

Creating images/jmods/jdk.jdi.jmod

Creating images/jmods/jdk.jdwp.agent.jmod

Creating images/jmods/jdk.jshell.jmod

Creating images/jmods/jdk.jstatd.jmod

Creating images/jmods/jdk.management.jmod

Creating images/jmods/jdk.net.jmod

Creating images/jmods/jdk.pack200.jmod

Creating images/jmods/jdk.policytool.jmod

Creating images/jmods/jdk.scripting.nashorn.shell.jmod

Creating images/jmods/jdk.sctp.jmod

Creating images/jmods/jdk.security.auth.jmod

Creating images/jmods/java.corba.jmod

Creating images/jmods/jdk.rmic.jmod

Creating images/jmods/jdk.xml.bind.jmod

Creating images/jmods/jdk.xml.ws.jmod

Creating images/jmods/java.desktop.jmod

Creating images/jmods/jdk.accessibility.jmod

Creating images/jmods/java.base.jmod

Creating interim jimage

Compiling 3 files for BUILD_JIGSAW_TOOLS

Creating support/classlist.jar

Creating jre jimage

Creating jdk jimage

Stopping sjavac server

Finished building target 'all' in configuration 'macosx-x86_64-normal-server-release'

 

real 11m56.424s

user 18m48.669s

sys 3m39.355s

 

[jdk9 547]$ ./build/macosx-x86_64-normal-server-release/jdk/bin/java -version

openjdk version "9-internal"

OpenJDK Runtime Environment (build 9-internal+0-2016-09-14-155327.starksm.jdk9)

OpenJDK 64-Bit Server VM (build 9-internal+0-2016-09-14-155327.starksm.jdk9, mixed mode)


 

I'll be leading a hack night with Eurotech on Tuesday June 28, 6:30 pm at the upcoming DevNation. Mike Guerette details how to register in his blog post: DevNation 2016 evening workshop – Internet of Things – sponsored by Eurotech – Red Hat Developer Blog

 

Make sure you register, as space is limited. You will be walking away with a customized sensor tag and other goodies, in addition to learning how to integrate real-time sensor data into your middleware.

[Image: rhiottag.png]

[Image: Eurotech-Logo_800x141.png]

Red Hat is a sponsor of the current iot.eclipse.org Open IoT Developer Challenge.

 

The key dates are:

  • Applications for the developer challenge are open now and will close November 23, 2015. To apply, complete the application form on the challenge page.
  • Hardware kits – On December 7, 2015, the teams behind the 15 best proposals will be offered $150 to buy hardware online, plus access to special sponsor offerings.
  • Submissions of the final solution must be completed by February 26, 2016 (11:59pm PDT). Prizes will be awarded for the top 3 solutions.

 

Rules:

  • The challenge is open to any individual aged 18 years or older.
  • The solution submitted by the applicants must be a new solution created for this challenge.
  • More than one individual can participate in creating a solution but any prize will be awarded to the key contact who completed the entry form.

 

See the Open IoT Developer challenge page for the application and the full details.

[Image: Milestone.jpg]

What do we have here? A BLE beacon scanner running Pidora on a Raspberry Pi connected to an LCD screen displaying its beacon heartbeat count and RSSI on the first line, the time the scanner has been up on the second, the system load average on the third, and the total number of BLE events followed by the number of messages sent to the ActiveMQ broker on the fourth. As I write this, the 10,000,000 events milestone has been reached, 28 minutes after this picture was taken.

 

The beacon scanner software running on the Pi is a custom C++ application that integrates directly with the BlueZ Bluetooth stack, pulls out the beacon BLE advertising events, and packages them into messages that are forwarded via a pluggable messaging provider interface. Currently the scanner has provider implementations over our Proton AMQP and JMS/C++ bindings for ActiveMQ. There is also an Eclipse Paho MQTT implementation for talking MQTT to ActiveMQ.
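To make the pluggable provider idea concrete, here is a minimal Java sketch of what such an interface and an MQTT-backed implementation could look like. This is only an illustration: the names (BeaconEventPublisher, PahoMqttPublisher), the broker URL, and the CSV payload format are my own assumptions, and the real implementation is the C++ code in the GitHub repository linked below.

// Illustrative sketch only; the actual scanner is C++, and these names are hypothetical.
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

/** A provider abstraction like the one the scanner publishes beacon events through. */
interface BeaconEventPublisher {
    void publish(String scannerId, int minor, int rssi, long time) throws Exception;
    void close() throws Exception;
}

/** One possible provider implementation: MQTT via the Eclipse Paho client. */
class PahoMqttPublisher implements BeaconEventPublisher {
    private final MqttClient client;
    private final String topic;

    PahoMqttPublisher(String brokerUrl, String clientId, String topic) throws MqttException {
        this.client = new MqttClient(brokerUrl, clientId);
        this.topic = topic;
        client.connect();
    }

    @Override
    public void publish(String scannerId, int minor, int rssi, long time) throws MqttException {
        // Pack the beacon event into a simple CSV payload and publish it to the broker
        String payload = scannerId + "," + minor + "," + rssi + "," + time;
        client.publish(topic, new MqttMessage(payload.getBytes()));
    }

    @Override
    public void close() throws MqttException {
        client.disconnect();
    }
}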

 

This is a project I am working on with Burr Sutter for our upcoming IoT hackathon at this year's Red Hat Summit and DevNation.

 

Hopefully you can stop by.

 

In the meantime, check out the code in progress over here:

starksm64/NativeRaspberryPiBeaconParser · GitHub

Someone asked, "Can I write a producer method without knowing the type until runtime?" and I was looking at something similar while trying to figure out what needs an Extension, so I mocked up a little solution that uses a general producer method based on a specific qualifier annotation and a wrapping parameterized interface. See the forum thread for the original question and this possible approach. While this may not be as transparent and generic as the user was asking for, it is a pretty simple way to achieve some form of dynamic production of injection values.

 

1. Create a qualifier annotation, ConfigType:

package test.com.si.weld.dynproducer;


import javax.inject.Qualifier;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import static java.lang.annotation.ElementType.*;
import static java.lang.annotation.RetentionPolicy.*;


/**
 * @author Scott Stark
 * @version $Revision:$
 */
@Qualifier
@Retention(RUNTIME)
@Target({TYPE, METHOD, FIELD, PARAMETER})
public @interface ConfigType {
}

 

2. Create a simple parameterized interface and wrapper implementation to hold the user defined configuration class types:

package test.com.si.weld.dynproducer;

public interface IStoreSettings<T> {
    T getSettings();
}
package test.com.si.weld.dynproducer;


public class StoreSettingsWrapper<T> implements IStoreSettings<T> {
    private T settings;


    StoreSettingsWrapper(T settings) {
        this.settings = settings;
    }
    @Override
    public T getSettings() {
        return settings;
    }
    @Override
    public String toString() {
        return settings.toString();
    }
}

 

3. Create a general producer method that takes an InjectionPoint:

package test.com.si.weld.dynproducer;


import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.inject.spi.InjectionPoint;
import java.lang.reflect.Constructor;
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.Arrays;


/**
 * @author Scott Stark
 * @version $Revision:$
 */
public class GenericConfigFactory {
    @Produces @ConfigType
    public IStoreSettings configProducer(InjectionPoint ip, BeanManager beanManager) throws Exception {
        dumpInjectionPoint(ip);
        // Determine the requested settings class from the injection point's parameterized type,
        // e.g. IStoreSettings<MyDataStoreSettings> yields MyDataStoreSettings
        ParameterizedType type = (ParameterizedType) ip.getType();
        Type[] typeArgs = type.getActualTypeArguments();
        Class<?> settingsClass = (Class<?>) typeArgs[0];
        // Instantiate the settings class via its public no-arg constructor and wrap it
        Constructor<?> ctor = settingsClass.getConstructor();
        Object settings = ctor.newInstance();
        StoreSettingsWrapper wrapper = new StoreSettingsWrapper(settings);
        return wrapper;
    }


    private void dumpInjectionPoint(InjectionPoint ip) {
        StringBuilder tmp = new StringBuilder("InjectionPoint");
        tmp.append("\n\tgetAnnotated:"+ip.getAnnotated());
        tmp.append(";\n\t getType:"+ip.getType());
        ParameterizedType type = (ParameterizedType) ip.getType();
        Type[] typeArgs = type.getActualTypeArguments();
        Type rawType = type.getRawType();
        tmp.append("\n\t\ttypeArgs: "+ Arrays.asList(typeArgs));
        tmp.append("\n\t\trawType: "+ rawType);
        tmp.append(";\n\t getQualifiers:"+ip.getQualifiers());
        tmp.append(";\n\t getBean:"+ip.getBean());
        tmp.append(";\n\t getMember:"+ip.getMember());
        tmp.append(";\n\t isDelegate:"+ip.isDelegate());
        tmp.append(";\n\t isTransient:"+ip.isTransient());
        System.out.println(tmp.toString());
    }
}

 

 

Now one can create different types of configuration objects and have them injected into a consumer:

 

package test.com.si.weld.dynproducer;


public @interface Config {
    public String value();
}

package test.com.si.weld.dynproducer;


import javax.inject.Inject;


public class ConfigUser {
    @Inject @ConfigType
    private IStoreSettings<MyDataStoreSettings> settings;
    @Inject @ConfigType
    private IStoreSettings<AnotherDataStoreSettings> settings2;


    @Override
    public String toString() {
        return "ConfigUser{" +
                "settings2=" + settings2 +
                ", settings=" + settings +
                '}';
    }
}

package test.com.si.weld.dynproducer;


import java.io.File;


public class MyDataStoreSettings {
    @Config("number.of.threads.key")
    int numberOfThreads = 1;
    @Config("root.folder.key")
    File rootFolder = new File("/tmp");


    @Override
    public String toString() {
        return "MyDataStoreSettings{" +
                "numberOfThreads=" + numberOfThreads +
                ", rootFolder=" + rootFolder +
                '}';
    }
}

package test.com.si.weld.dynproducer;


public class AnotherDataStoreSettings {
    @Config("thread.count.key")
    int threadCount = 2;
    @Config("tmp.path.key")
    String tmpPath = "/tmp";


    @Override
    public String toString() {
        return "AnotherDataStoreSettings{" +
                "threadCount=" + threadCount +
                ", tmpPath=" + tmpPath +
                '}';
    }
}

 

A little Arquillian testcase illustrates the behavior:

 

package test.com.si.weld.dynproducer;


import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Test;
import org.junit.runner.RunWith;


import javax.inject.Inject;


@RunWith(Arquillian.class)
public class DynProducerTest {
    @Deployment
    public static JavaArchive createDeployment()
    {
        JavaArchive archive = ShrinkWrap.create(JavaArchive.class)
          .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
        archive.addPackages(true, "test/com/si/weld/dynproducer");
        return archive;
    }


    @Inject
    ConfigUser configUser;


    @Test
    public void testDynProducer() {
        System.out.printf("configUser=%s\n", configUser);
    }
}

 

Outputs:

 

34 [main] INFO org.jboss.weld.Version - WELD-000900 1.1.9 (Final)
InjectionPoint
          getAnnotated:[field] @ConfigType @Inject private test.com.si.weld.dynproducer.ConfigUser.settings;
           getType:interface test.com.si.weld.dynproducer.IStoreSettings<class test.com.si.weld.dynproducer.MyDataStoreSettings>
                    typeArgs: [class test.com.si.weld.dynproducer.MyDataStoreSettings]
                    rawType: interface test.com.si.weld.dynproducer.IStoreSettings;
           getQualifiers:[@test.com.si.weld.dynproducer.ConfigType()];
           getBean:Managed Bean [class test.com.si.weld.dynproducer.ConfigUser] with qualifiers [@Any @Default];
           getMember:private test.com.si.weld.dynproducer.IStoreSettings test.com.si.weld.dynproducer.ConfigUser.settings;
           isDelegate:false;
           isTransient:false
InjectionPoint
          getAnnotated:[field] @ConfigType @Inject private test.com.si.weld.dynproducer.ConfigUser.settings2;
           getType:interface test.com.si.weld.dynproducer.IStoreSettings<class test.com.si.weld.dynproducer.AnotherDataStoreSettings>
                    typeArgs: [class test.com.si.weld.dynproducer.AnotherDataStoreSettings]
                    rawType: interface test.com.si.weld.dynproducer.IStoreSettings;
           getQualifiers:[@test.com.si.weld.dynproducer.ConfigType()];
           getBean:Managed Bean [class test.com.si.weld.dynproducer.ConfigUser] with qualifiers [@Any @Default];
           getMember:private test.com.si.weld.dynproducer.IStoreSettings test.com.si.weld.dynproducer.ConfigUser.settings2;
           isDelegate:false;
           isTransient:false
configUser=ConfigUser{settings2=AnotherDataStoreSettings{threadCount=2, tmpPath=/tmp}, settings=MyDataStoreSettings{numberOfThreads=1, rootFolder=/tmp}}

 

See the attached dynproducer.zip for the full source.

 

ironmaiden:SITesting starksm$ jar -tf dynproducer.zip

src/test/java/test/com/si/weld/dynproducer/

src/test/java/test/com/si/weld/dynproducer/AnotherDataStoreSettings.java

src/test/java/test/com/si/weld/dynproducer/Config.java

src/test/java/test/com/si/weld/dynproducer/ConfigType.java

src/test/java/test/com/si/weld/dynproducer/ConfigUser.java

src/test/java/test/com/si/weld/dynproducer/DynProducerTest.java

src/test/java/test/com/si/weld/dynproducer/GenericConfigFactory.java

src/test/java/test/com/si/weld/dynproducer/IStoreSettings.java

src/test/java/test/com/si/weld/dynproducer/MyDataStoreSettings.java

src/test/java/test/com/si/weld/dynproducer/StoreSettingsWrapper.java

So the next thing I wanted to do was tie into the shutdown of the Weld framework, and the way to do that is via a javax.enterprise.inject.spi.Extension implementation. The default way to add an Extension is the META-INF/services mechanism (a one-line registration is shown for reference at the end of this post), but since the custom startup class had access to the Weld instance, which has an addExtension method, I leveraged that along with a couple of new annotations to have the custom startup handle all of the details. The new classes are:

 

package com.si.weld;


import org.jboss.weld.environment.se.Weld;
import org.jboss.weld.environment.se.WeldContainer;


import javax.enterprise.inject.spi.Extension;
import javax.enterprise.util.AnnotationLiteral;
import java.lang.annotation.Annotation;


/**
 * A weld startup class for use in Java SE environment
 *
 * @author Scott Stark
 * @version $Revision:$
 */
public class CustomWeldStartMain {


    /**
     * The entry point to the weld initialization
     * @param args - the
     *             [0] = the class name of WeldMain class to bootstrap
     *             [1..n] = the args to pass to the WeldMain.main(String...) method
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        // Need at least one arg giving the WeldMain implementation class name
        if(args.length == 0) {
            throw new IllegalStateException("Non-zero arguments required, first argument must be main class name");
        }
        // Load the class to use as the main class
        String mainClassName = args[0];
        Class<?> mainClass = CustomWeldStartMain.class.getClassLoader().loadClass(mainClassName);
        if(WeldMain.class.isAssignableFrom(mainClass) == false) {
            throw new IllegalStateException(mainClassName+" does not implement WeldMain");
        }
        Class<WeldMain> weldMainClass = (Class<WeldMain>) mainClass;
        Weld weld = new Weld();
        // See if there is a main class extension
        Extension mainExtension = loadExtension(weldMainClass, weld);
        if(mainExtension instanceof DefaultWeldMainExtension) {
            // This is a hack to tie into extension to know which WeldMain impl was used
            DefaultWeldMainExtension tme = (DefaultWeldMainExtension) mainExtension;
            tme.setWeldMainClass(weldMainClass);
        }
        // Standard Weld bootstrap from org.jboss.weld.environment.se.StartMain
        WeldContainer weldContainer = weld.initialize();
        Annotation qualifier = new AnnotationLiteral<WeldMainType>() {};
        WeldMain main = weldContainer.instance().select(weldMainClass, qualifier).get();


        // Add the SE shutdown hook
        Runtime.getRuntime().addShutdownHook(new ShutdownHook(weld));
        // Call the WeldMain.main() entry point
        String[] subargs = new String[args.length-1];
        System.arraycopy(args, 1, subargs, 0, args.length-1);
        main.main(subargs);
    }


    /**
     * Look for an Extension implementation via a WeldMainExtension on the weldMainClass
     * @param weldMainClass
     * @param weld
     */
    private static Extension loadExtension(Class<WeldMain> weldMainClass, Weld weld) {
        Extension extension = null;
        WeldMainExtension extensionType = weldMainClass.getAnnotation(WeldMainExtension.class);
        if(extensionType != null) {
            try {
                Class<? extends Extension> c = extensionType.value();
                extension = c.newInstance();
                weld.addExtension(extension);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
        return extension;
    }


    static class ShutdownHook extends Thread {
        private final Weld weld;


        ShutdownHook(final Weld weld) {
            this.weld = weld;
        }


        public void run() {
            weld.shutdown();
        }
    }
}

package com.si.weld;


/**
 * A simple interface defining the Weld post bootstrap main entry point.
 *
 * @author Scott Stark
 * @version $Revision:$
 */
public interface WeldMain {
    public void main(String[] args) throws Exception;
    public void shutdown();
}

package com.si.weld;


import javax.enterprise.inject.spi.Extension;
import javax.inject.Qualifier;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import static java.lang.annotation.ElementType.*;
import static java.lang.annotation.RetentionPolicy.*;


/**
 * A WeldMainExtension qualifier
 * @author Scott Stark
 * @version $Revision:$
 */
@Qualifier
@Target({TYPE})
@Retention(RUNTIME)
public @interface WeldMainExtension {
    Class<? extends Extension> value();
}

package com.si.weld;


import javax.inject.Qualifier;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import static java.lang.annotation.ElementType.*;
import static java.lang.annotation.RetentionPolicy.*;


/**
 * A WeldMain qualifier used to identify which WeldMain bean was used
 * @author Scott Stark
 * @version $Revision:$
 */
@Qualifier
@Target({TYPE})
@Retention(RUNTIME)
public @interface WeldMainType {
}

package com.si.weld;


import javax.enterprise.context.spi.CreationalContext;
import javax.enterprise.event.Observes;
import javax.enterprise.inject.UnsatisfiedResolutionException;
import javax.enterprise.inject.spi.AnnotatedConstructor;
import javax.enterprise.inject.spi.AnnotatedField;
import javax.enterprise.inject.spi.AnnotatedMethod;
import javax.enterprise.inject.spi.AnnotatedType;
import javax.enterprise.inject.spi.Bean;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.inject.spi.BeforeShutdown;
import javax.enterprise.inject.spi.Extension;
import javax.enterprise.inject.spi.ProcessAnnotatedType;
import javax.enterprise.util.AnnotationLiteral;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;


/**
 * A Weld Extension that marks the ManagedBean used as the WeldMain instance with a
 * WeldMainType qualifier to allow it to be selected during shutdown, and its WeldMain.shutdown
 * method called.
 * @author Scott Stark
 * @version $Revision:$
 */
public class DefaultWeldMainExtension implements Extension {
    private Class<WeldMain> weldMainClass;
    private HashSet<WeldMainAnnotatedType> weldMainTypes = new HashSet<WeldMainAnnotatedType>();


    <T> void processAnnotatedType(@Observes ProcessAnnotatedType<T> pat) {
        AnnotatedType<T> atype = pat.getAnnotatedType();
        if(WeldMain.class.isAssignableFrom(atype.getJavaClass())) {
            //System.out.printf("TestMainExtension: scanning type: %s\n", atype.getJavaClass().getName());
            // If this is a WeldMain, add a wrapper to later hold a WeldMainType qualifier
            if(atype.getJavaClass().equals(weldMainClass)) {
                WeldMainAnnotatedType wrapped = new WeldMainAnnotatedType(atype);
                weldMainTypes.add(wrapped);
                pat.setAnnotatedType(wrapped);
            }
        }
    }
    void beforeShutdown(@Observes BeforeShutdown shutdown, final BeanManager beanManager){
        // Find the WeldMain to invoke shutdown on
        Annotation qualifier = new AnnotationLiteral<WeldMainType>() {};
        WeldMain main = getInstanceByType(beanManager, WeldMain.class, qualifier);
        main.shutdown();
    }


    protected <T> T getInstanceByType(BeanManager manager, Class<T> type, Annotation... bindings) {
        Set<Bean<?>> beans = manager.getBeans(type, bindings);
        final Bean<?> bean = manager.resolve(beans);
        if (bean == null) {
            throw new UnsatisfiedResolutionException("Unable to resolve a bean for " + type + " with bindings " + Arrays.asList(bindings));
        }
        CreationalContext<?> cc = manager.createCreationalContext(bean);
        return type.cast(manager.getReference(bean, type, cc));
    }


    public void setWeldMainClass(Class<WeldMain> weldMainClass) {
        this.weldMainClass = weldMainClass;
    }


    private static class WeldMainAnnotatedType<T> implements AnnotatedType<T> {
        private final AnnotatedType<T> type;
        private boolean markAsWeldMainType = true;
        WeldMainAnnotatedType(AnnotatedType<T> type){
            this.type = type;
        }


        @Override
        public boolean isAnnotationPresent(Class<? extends Annotation> annotationType) {
           return annotationType.equals(WeldMainType.class) ?
                   markAsWeldMainType : type.isAnnotationPresent(annotationType);
        }


        @Override
        public Set<AnnotatedConstructor<T>> getConstructors() {
            return type.getConstructors();
        }


        @Override
        public Set<AnnotatedField<? super T>> getFields() {
            return type.getFields();
        }


        @Override
        public Class<T> getJavaClass() {
            return type.getJavaClass();
        }


        @Override
        public Set<AnnotatedMethod<? super T>> getMethods() {
            return type.getMethods();
        }


        @Override
        public <T extends Annotation> T getAnnotation(Class<T> annotationType) {
            return type.getAnnotation(annotationType);
        }


        @Override
        public Set<Annotation> getAnnotations() {
            HashSet<Annotation> annotations = new HashSet<Annotation>(type.getAnnotations());
            if(markAsWeldMainType) {
                annotations.add(new AnnotationLiteral<WeldMainType>() {});
            }
            return annotations;
        }


        @Override
        public Type getBaseType() {
            return type.getBaseType();
        }


        @Override
        public Set<Type> getTypeClosure() {
            return type.getTypeClosure();
        }
    }


}

 

The new sample TestMain2 becomes:

 

package test.com.si.weld;

import com.si.weld.DefaultWeldMainExtension;
import com.si.weld.WeldMain;
import com.si.weld.WeldMainExtension;

import javax.enterprise.inject.Default;
import javax.inject.Singleton;
import java.util.Arrays;


/**
 * A minimalist WeldMain implementation
 *
 * @author Scott Stark
 * @version $Revision:$
 */
@Singleton
@Default
@WeldMainExtension(DefaultWeldMainExtension.class)
public class TestMain2 implements WeldMain {
    public TestMain2() {
    }


    public void main(String[] args) {
        System.out.printf("TestMain2.main(%s)\n", Arrays.asList(args));
    }


    @Override
    public void shutdown() {
        System.out.printf("TestMain2.shutdown()\n");
    }
}

 

 

Which, when run using the CustomWeldStartMain entry point, produces:

 

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Didea.launcher.port=7537 "..." com.intellij.rt.execution.application.AppMain com.si.weld.CustomWeldStartMain test.com.si.weld.TestMain2 arg2 arg3
34 [main] INFO org.jboss.weld.Version - WELD-000900 1.1.9 (Final)
296 [main] INFO org.jboss.weld.Bootstrap - WELD-000101 Transactional services not available. Injection of @Inject UserTransaction not available. Transactional observers will be invoked synchronously.
TestMain2.main([arg2, arg3])
TestMain2.shutdown()


Process finished with exit code 0

 

 

 

It was a whole lot more code just to add support for invoking the WeldMain.shutdown method, and most of that had to do with getting a qualifier onto the ManagedBean that was used as the WeldMain implementation. Perhaps there is an easier way.
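For reference, the META-INF/services mechanism mentioned at the top of this post would have registered the extension with a single provider-configuration file on the classpath instead of the addExtension call, something like:

# file: META-INF/services/javax.enterprise.inject.spi.Extension
com.si.weld.DefaultWeldMainExtension

The addExtension route was used here because the startup class then holds the extension instance and can call setWeldMainClass on it before the container is initialized.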

I was working with the Weld framework in a Java SE environment, largely through the IDE, and had found posts about using the org.jboss.weld.environment.se.StartMain class to bootstrap Weld from within Java SE. However, I wanted more control over which class was targeted, as the prescribed approach of having a ContainerInitialized event listener like:

 

import java.util.List;
import javax.enterprise.event.Observes;
import org.jboss.weld.environment.se.bindings.Parameters;
import org.jboss.weld.environment.se.events.ContainerInitialized;

public class TestMain {
    public void main(@Observes ContainerInitialized event, @Parameters List parameters) {
        System.out.printf("TestMain.main called, parameters=%s\n", parameters);
    }
}

 

would result in every such listener on the IDE classpath being called. I took the simple StartMain bootstrap code and created the following CustomWeldStartMain that accepts the name of the class to use as the post-bootstrap entry point:

 

package com.si.weld;


import org.jboss.weld.environment.se.Weld;
import org.jboss.weld.environment.se.WeldContainer;


/**
 * A weld startup class for use in Java SE environment
 *
 * @author Scott Stark
 * @version $Revision:$
 */
public class CustomWeldStartMain {


    /**
     * The entry point to the weld initialization
     * @param args - the
     *             [0] = the class name of WeldMain class to bootstrap
     *             [1..n] = the args to pass to the WeldMain.main(String...) method
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        // Need at least one arg giving the WeldMain implementation class name
        if(args.length == 0) {
            throw new IllegalStateException("Non-zero arguments required, first argument must be main class name");
        }
        // Load the class to use as the main class
        String mainClassName = args[0];
        Class<?> mainClass = CustomWeldStartMain.class.getClassLoader().loadClass(mainClassName);
        if(WeldMain.class.isAssignableFrom(mainClass) == false) {
            throw new IllegalStateException(mainClassName+" does not implement WeldMain");
        }
        Class<WeldMain> weldMainClass = (Class<WeldMain>) mainClass;
        // Standard Weld bootstrap from org.jboss.weld.environment.se.StartMain
        Weld weld = new Weld();
        WeldContainer weldContainer = weld.initialize();
        WeldMain main = weldContainer.instance().select(weldMainClass).get();
        // Add the SE shutdown hook
        Runtime.getRuntime().addShutdownHook(new ShutdownHook(weld));
        // Call the WeldMain.main() entry point
        String[] subargs = new String[args.length-1];
        System.arraycopy(args, 1, subargs, 0, args.length-1);
        main.main(subargs);
    }


    static class ShutdownHook extends Thread {
        private final Weld weld;


        ShutdownHook(final Weld weld) {
            this.weld = weld;
        }


        public void run() {
            weld.shutdown();
        }
    }
}

package com.si.weld;


/**
 * A simple interface defining the Weld post bootstrap main entry point.
 * 
 * @author Scott Stark
 * @version $Revision:$
 */
public interface WeldMain {
    public void main(String[] args) throws Exception;
}

 

Here is a sample test WeldMain entry point that is invoked when running from within the IDE using a run configuration:

 

package test.com.si.weld;


import com.si.weld.WeldMain;


import javax.inject.Singleton;
import java.util.Arrays;


/**
 * A minimalist WeldMain implementation
 *
 * @author Scott Stark
 * @version $Revision:$
 */
@Singleton
public class TestMain2 implements WeldMain {
    public void main(String[] args) {
        System.out.printf("TestMain2.main(%s)\n", Arrays.asList(args));
    }
}

 

 

To run this class, I set up a run configuration that specified com.si.weld.CustomWeldStartMain as the main class and passed test.com.si.weld.TestMain2 as the first program argument, followed by arg2 and arg3. Running this within the IDE produced:

 

/System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -Didea.launcher.port=7537 "..." com.intellij.rt.execution.application.AppMain com.si.weld.CustomWeldStartMain test.com.si.weld.TestMain2 arg2 arg3

45 [main] INFO org.jboss.weld.Version - WELD-000900 1.1.9 (Final)

263 [main] INFO org.jboss.weld.Bootstrap - WELD-000101 Transactional services not available. Injection of @Inject UserTransaction not available. Transactional observers will be invoked synchronously.

TestMain2.main([arg2, arg3])

Process finished with exit code 0

 

Maybe this is of general interest as an alternative StartMain?

With the announcement of EAP 6, EAP is now available in the cloud environment provided by OpenShift. This post updates how to test the OpenShift jboss-eap-6.0 based cartridge standalone configuration in your own environment.

 

The configuration of the JBossAS7 server used by the OpenShift Express JBossAS cartridge is a simple modification of the jboss-eap-6.0 release, which may be obtained from https://access.redhat.com/home

 

The contents of the jboss-eap-6.0 release need to be updated with the attached standalone.xml, standalone.conf, and the database driver modules. The exact steps would be (a condensed command transcript follows the list):

  1. Create a demo directory to contain the files. I use /tmp/eap6conf, so if you use the same, no changes need to be made to the following directions or configuration files.
  2. Download jboss-eap-6.0.zip from https://access.redhat.com/home using your subscription.
  3. unzip jboss-eap-6.0.zip
  4. Create a link from jbosseap-6.0 to jboss-eap-6.0, or copy the jboss-eap-6.0 directory to jbosseap-6.0
    1. ln -s jboss-eap-6.0 jbosseap-6.0
  5. Download the attachments on this blog
  6. unzip bin.zip to get the bin/{standalone.conf,standalone.sh} files
  7. cp bin/* to jbosseap-6.0/bin
  8. Edit the jbosseap-6.0/bin/standalone.conf file to change any of the environment variables in the first section to match your environment.
  9. unzip the modules.zip from within the jbosseap-6.0 directory to add the modules for mysql, mongodb, and switchyard to the server modules. (Note, the attached modules.zip does not contain the switchyard modules as this made the attachment too large. To build the full modules.zip, clone the https://github.com/openshift/jboss-as7-modules repo and do a 'gradle createModuleZip' to build it.)
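
Condensed into commands, the steps above look roughly like the following. This is just a sketch assuming the demo directory is /tmp/eap6conf and that jboss-eap-6.0.zip plus the blog attachments (bin.zip, modules.zip) have already been downloaded there; adjust paths to match your environment.

mkdir /tmp/eap6conf && cd /tmp/eap6conf
unzip jboss-eap-6.0.zip
ln -s jboss-eap-6.0 jbosseap-6.0
unzip bin.zip                             # yields bin/standalone.conf and bin/standalone.sh
cp bin/* jbosseap-6.0/bin
# edit jbosseap-6.0/bin/standalone.conf environment variables as needed (see below)
cd jbosseap-6.0 && unzip ../modules.zip   # adds the mysql, mongodb and switchyard modules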

 

OpenShift Environment Variables

There are several environment variables referenced in the OpenShift JBossAS7 cartridge standalone.xml and standalone.conf files. They are:

 

# OPENSHIFT_GEAR_NAME is the NAME of the application as passed to the rhc create app -a NAME … command
OPENSHIFT_GEAR_NAME

# OPENSHIFT_REPO_DIR is the git repository of the application deployment. For reproducing the standalone configuration
# it has no meaningful value, as it is only used to pick up application-specific overrides of the server
# configuration files.
OPENSHIFT_REPO_DIR = 

# OPENSHIFT_GEAR_DIR is the root directory of the application. 
OPENSHIFT_GEAR_DIR = 


# OPENSHIFT_GEAR_TYPE is jbosseap-6.0 for a jbossas cartridge. This is also used as the root of
# the jbosseap-6.0 application server associated with the application. In the OpenShift environment, each
# application is associated with a user that has a copy of the jboss-eap-6.0 server contents installed to ${OPENSHIFT_GEAR_DIR}${OPENSHIFT_GEAR_TYPE}, e.g., /tmp/demo/jbosseap-6.0.
OPENSHIFT_GEAR_TYPE = jbosseap-6.0

# OPENSHIFT_DB_TYPE is the type of the embedded database configured using the rhc-ctl-app command:
# rhc-ctl-app -a demo -e add-mysql-5.1
# This will be empty if no database cartridge has been embedded.
OPENSHIFT_DB_TYPE = 
# OPENSHIFT_DB_HOST gives the IP address the embedded database is listening on
OPENSHIFT_DB_HOST
# OPENSHIFT_DB_PORT gives the port the embedded database is listening on
OPENSHIFT_DB_PORT
# OPENSHIFT_DB_USERNAME gives the configured username to access the embedded database
OPENSHIFT_DB_USERNAME
# OPENSHIFT_DB_PASSWORD gives the configured password to access the embedded database
OPENSHIFT_DB_PASSWORD

# OPENSHIFT_INTERNAL_IP is the local ip address to bind the server's services to. Typically you would use
# localhost/127.0.0.1 for your local environment.
OPENSHIFT_INTERNAL_IP = 

# OPENSHIFT_JBOSS_CLUSTER is the ip address[port] for the TCPPING initial_hosts property used by the jgroups
# subsystem configuration. This is used when scaling of the application is enabled in OpenShift. For local, single server
# testing this will not be used.
OPENSHIFT_JBOSS_CLUSTER = 

# OPENSHIFT_JBOSS_CLUSTER_PROXY_PORT is the port used by the JGroups TCP protocol bind_port property.
OPENSHIFT_JBOSS_CLUSTER_PROXY_PORT = 7600

# OPENSHIFT_GEAR_DNS is the public dns name of the server hosting the application created by rhc app create -a … .
# For an application named demo, in a domain named jbossdev, the DNS name would be: demo-jbossdev.rhcloud.com.
# This is used by the jgroups subsystem configuration as the ip address/hostname for the external_addr property used to
# communicate with other jboss instances. For a local, single server testing this should be the same as the
# OPENSHIFT_INTERNAL_IP value.
OPENSHIFT_GEAR_DNS = 

# The node_profile value is derived from the type of node the application is running on. The default is small, as that is the free
# offering, which provides 500Mb of memory to the application.
node_profile = small

 

The relevant values for these in the attached standalone.conf file that I tested with are:

 

#### Edit these variables for your test environment
export OPENSHIFT_GEAR_NAME=eap6conf
export OPENSHIFT_GEAR_DIR=/tmp/eap6conf/
export OPENSHIFT_GEAR_TYPE=jbosseap-6.0
export OPENSHIFT_INTERNAL_IP=127.0.0.1
export OPENSHIFT_GEAR_DNS=127.0.0.1
export node_profile=small
####

 

Running the OpenShift configuration is simply a matter of executing standalone.sh from your jbosseap-6.0/bin directory:

 

[800](ironmaiden eap6]) > cd /tmp/eap6conf/jbosseap-6.0/bin
[801](ironmaiden bin]) >./standalone.sh
=========================================================================


  JBoss Bootstrap Environment


  JBOSS_HOME: /tmp/eap6conf/jbosseap-6.0


  JAVA: /System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home/bin/java


  JAVA_OPTS: -d32 -client -Xmx256m -XX:MaxPermSize=128m -XX:+AggressiveOpts -Dorg.apache.tomcat.util.LOW_MEMORY=true -Dorg.jboss.resolver.warning=true -Djava.net.preferIPv4Stack=true -Djboss.node.name=127.0.0.1 -Djgroups.bind_addr=127.0.0.1


=========================================================================


14:23:01,595 INFO  [org.jboss.modules] JBoss Modules version 1.1.2.GA-redhat-1
14:23:01,784 INFO  [org.jboss.msc] JBoss MSC version 1.0.2.GA-redhat-1
14:23:01,828 INFO  [org.jboss.as] JBAS015899: JBoss EAP 6.0.0.GA (AS 7.1.2.Final-redhat-1) starting
14:23:02,676 INFO  [org.xnio] XNIO Version 3.0.4.GA-redhat-1
14:23:02,676 INFO  [org.jboss.as.server] JBAS015888: Creating http management service using socket-binding (management-http)
14:23:02,684 INFO  [org.xnio.nio] XNIO NIO Implementation Version 3.0.4.GA-redhat-1
14:23:02,693 INFO  [org.jboss.remoting] JBoss Remoting version 3.2.8.GA-redhat-1
14:23:02,708 INFO  [org.jboss.as.logging] JBAS011502: Removing bootstrap log handlers
14:23:02,711 INFO  [org.jboss.as.configadmin] (ServerService Thread Pool -- 26) JBAS016200: Activating ConfigAdmin Subsystem
14:23:02,728 INFO  [org.jboss.as.clustering.infinispan] (ServerService Thread Pool -- 31) JBAS010280: Activating Infinispan subsystem.
14:23:02,752 INFO  [org.jboss.as.naming] (ServerService Thread Pool -- 38) JBAS011800: Activating Naming Subsystem
14:23:02,788 INFO  [org.jboss.as.security] (ServerService Thread Pool -- 44) JBAS013101: Activating Security Subsystem
14:23:02,797 INFO  [org.jboss.as.osgi] (ServerService Thread Pool -- 39) JBAS011906: Activating OSGi Subsystem
14:23:02,815 INFO  [org.jboss.as.security] (MSC service thread 1-4) JBAS013100: Current PicketBox version=4.0.9.Final-redhat-1
14:23:02,816 INFO  [org.jboss.as.webservices] (ServerService Thread Pool -- 48) JBAS015537: Activating WebServices Extension
14:23:02,822 INFO  [org.jboss.as.connector.logging] (MSC service thread 1-1) JBAS010408: Starting JCA Subsystem (JBoss IronJacamar 1.0.11.Final-redhat-1)
14:23:02,855 INFO  [org.jboss.as.connector.subsystems.datasources] (ServerService Thread Pool -- 27) JBAS010403: Deploying JDBC-compliant driver class org.h2.Driver (version 1.3)
14:23:02,878 INFO  [org.jboss.as.naming] (MSC service thread 1-5) JBAS011802: Starting Naming Service
14:23:02,891 INFO  [org.jboss.as.mail.extension] (MSC service thread 1-1) JBAS015400: Bound mail session [java:jboss/mail/Default]
14:23:03,072 INFO  [org.jboss.ws.common.management.AbstractServerConfig] (MSC service thread 1-3) JBoss Web Services - Stack CXF Server 4.0.4.GA-redhat-1
14:23:03,217 INFO  [org.apache.coyote.http11.Http11Protocol] (MSC service thread 1-1) Starting Coyote HTTP/1.1 on http-/127.0.0.1:8080
14:23:03,307 INFO  [org.jboss.as.connector.subsystems.datasources] (MSC service thread 1-6) JBAS010400: Bound data source [java:jboss/datasources/ExampleDS]
14:23:03,333 INFO  [org.jboss.as.server.deployment.scanner] (MSC service thread 1-6) JBAS015012: Started FileSystemDeploymentService for directory /tmp/eap6conf/jbosseap-6.0/standalone/deployments
14:23:03,350 INFO  [org.jboss.as.remoting] (MSC service thread 1-4) JBAS017100: Listening on 127.0.0.1:9999
14:23:03,350 INFO  [org.jboss.as.remoting] (MSC service thread 1-7) JBAS017100: Listening on 127.0.0.1:4447
14:23:03,409 INFO  [org.jboss.as] (Controller Boot Thread) JBAS015951: Admin console listening on http://127.0.0.1:9990
14:23:03,409 INFO  [org.jboss.as] (Controller Boot Thread) JBAS015874: JBoss EAP 6.0.0.GA (AS 7.1.2.Final-redhat-1) started in 2129ms - Started 134 of 214 services (79 services are passive or on-demand)

 

Diff of Configuration Files

The full difference between the standalone.xml configuration you would see in a current jbosseap-6.0 cartridge application on OpenShift and the base jboss-eap-6.0/standalone/configuration/standalone.xml is shown below.

 

[855](ironmaiden eap6]) > diff -w .openshift/config/standalone.xml /home/git/JBossAS/Downloads/jboss-eap-6.0/standalone/configuration/standalone.xml 
4d3
< 
7,8d5
<                     <extension module="org.jboss.as.clustering.jgroups" />
<                     <extension module="org.jboss.as.cmp" />
14,15d10
<                     <extension module="org.jboss.as.jacorb" />
<                     <extension module="org.jboss.as.jaxr" />
20d14
<                     <extension module="org.jboss.as.jsr77" />
23d16
<                     <extension module="org.jboss.as.messaging" />
36d28
< 
37a30,46
>         <security-realms>
>             <security-realm name="ManagementRealm">
>                 <authentication>
>                     <local default-user="$local"/>
>                     <properties path="mgmt-users.properties" relative-to="jboss.server.config.dir"/>
>                 </authentication>
>             </security-realm>
>             <security-realm name="ApplicationRealm">
>                 <authentication>
>                     <local default-user="$local" allowed-users="*"/>
>                     <properties path="application-users.properties" relative-to="jboss.server.config.dir"/>
>                 </authentication>
>                 <authorization>
>                     <properties path="application-roles.properties" relative-to="jboss.server.config.dir"/>
>                 </authorization>
>             </security-realm>
>         </security-realms>
39c48
<             <native-interface>
---
>             <native-interface security-realm="ManagementRealm">
42c51
<             <http-interface>
---
>             <http-interface security-realm="ManagementRealm">
47d55
< 
50,51c58,63
<                               <!--console-handler name="CONSOLE"> <level name="INFO"/> <formatter> <pattern-formatter 
<                                         pattern="%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%E%n"/> </formatter> </console-handler -->
---
>             <console-handler name="CONSOLE">
>                 <level name="INFO"/>
>                 <formatter>
>                     <pattern-formatter pattern="%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%E%n"/>
>                 </formatter>
>             </console-handler>
54,55c66
<                                                   <pattern-formatter
<                                                             pattern="%d{yyyy/MM/dd HH:mm:ss,SSS} %-5p [%c] (%t) %s%E%n" />
---
>                     <pattern-formatter pattern="%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%E%n"/>
79c90
<                                                   <!--handler name="CONSOLE"/ -->
---
>                     <handler name="CONSOLE"/>
84d94
<                     <subsystem xmlns="urn:jboss:domain:cmp:1.0" />
88,91c98,99
<                                         <datasource jndi-name="java:jboss/datasources/ExampleDS"
<                                                   enabled="true" use-java-context="true" pool-name="H2DS">
<                                                   <connection-url>jdbc:h2:${jboss.server.data.dir}/test;DB_CLOSE_DELAY=-1
<                                                   </connection-url>
---
>                 <datasource jndi-name="java:jboss/datasources/ExampleDS" pool-name="ExampleDS" enabled="true" use-java-context="true">
>                     <connection-url>jdbc:h2:mem:test;DB_CLOSE_DELAY=-1</connection-url>
98,118d105
<                                         <datasource jndi-name="java:jboss/datasources/MysqlDS"
<                                                   enabled="${mysql.enabled}" use-java-context="true" pool-name="MysqlDS">
<                                                   <connection-url>jdbc:mysql://${env.OPENSHIFT_DB_HOST}:${env.OPENSHIFT_DB_PORT}/${env.OPENSHIFT_APP_NAME}
<                                                   </connection-url>
<                                                   <driver>mysql</driver>
<                                                   <security>
<                                                             <user-name>${env.OPENSHIFT_DB_USERNAME}</user-name>
<                                                             <password>${env.OPENSHIFT_DB_PASSWORD}</password>
<                                                   </security>
<                                         </datasource>
<                                         <datasource jndi-name="java:jboss/datasources/PostgreSQLDS"
<                                                   enabled="${postgresql.enabled}" use-java-context="true" pool-name="PostgreSQLDS"
<                                                   use-ccm="true">
<                                                   <connection-url>jdbc:postgresql://${env.OPENSHIFT_DB_HOST}:${env.OPENSHIFT_DB_PORT}/${env.OPENSHIFT_APP_NAME}
<                                                   </connection-url>
<                                                   <driver>postgresql</driver>
<                                                   <security>
<                                                             <user-name>${env.OPENSHIFT_DB_USERNAME}</user-name>
<                                                             <password>${env.OPENSHIFT_DB_PASSWORD}</password>
<                                                   </security>
<                                         </datasource>
121,130c108
<                                                             <xa-datasource-class>org.h2.jdbcx.JdbcDataSource
<                                                             </xa-datasource-class>
<                                                   </driver>
<                                                   <driver name="mysql" module="com.mysql.jdbc">
<                                                             <xa-datasource-class>com.mysql.jdbc.jdbc2.optional.MysqlXADataSource
<                                                             </xa-datasource-class>
<                                                   </driver>
<                                                   <driver name="postgresql" module="org.postgresql.jdbc">
<                                                             <xa-datasource-class>org.postgresql.xa.PGXADataSource
<                                                             </xa-datasource-class>
---
>                         <xa-datasource-class>org.h2.jdbcx.JdbcDataSource</xa-datasource-class>
136,138c114
<                               <deployment-scanner path="deployments"
<                                         relative-to="jboss.server.base.dir" scan-interval="5000"
<                                         deployment-timeout="300" />
---
>             <deployment-scanner path="deployments" relative-to="jboss.server.base.dir" scan-interval="5000"/>
141,144c117,118
<                               <spec-descriptor-property-replacement>false
<                               </spec-descriptor-property-replacement>
<                               <jboss-descriptor-property-replacement>true
<                               </jboss-descriptor-property-replacement>
---
>             <spec-descriptor-property-replacement>false</spec-descriptor-property-replacement>
>             <jboss-descriptor-property-replacement>true</jboss-descriptor-property-replacement>
151,152c125
<                                         <stateful default-access-timeout="5000" cache-ref="simple"
<                                                   clustered-cache-ref="clustered" />
---
>                 <stateful default-access-timeout="5000" cache-ref="simple"/>
155,158d127
<                               <mdb>
<                                         <resource-adapter-ref resource-adapter-name="hornetq-ra" />
<                                         <bean-instance-pool-ref pool-name="mdb-strict-max-pool" />
<                               </mdb>
161,166c130,131
<                                                   <strict-max-pool name="slsb-strict-max-pool"
<                                                             max-pool-size="20" instance-acquisition-timeout="5"
<                                                             instance-acquisition-timeout-unit="MINUTES" />
<                                                   <strict-max-pool name="mdb-strict-max-pool"
<                                                             max-pool-size="20" instance-acquisition-timeout="5"
<                                                             instance-acquisition-timeout-unit="MINUTES" />
---
>                     <strict-max-pool name="slsb-strict-max-pool" max-pool-size="20" instance-acquisition-timeout="5" instance-acquisition-timeout-unit="MINUTES"/>
>                     <strict-max-pool name="mdb-strict-max-pool" max-pool-size="20" instance-acquisition-timeout="5" instance-acquisition-timeout-unit="MINUTES"/>
171,174c136
<                                         <cache name="passivating" passivation-store-ref="file"
<                                                   aliases="SimpleStatefulCache" />
<                                         <cache name="clustered" passivation-store-ref="infinispan"
<                                                   aliases="StatefulTreeCache" />
---
>                 <cache name="passivating" passivation-store-ref="file" aliases="SimpleStatefulCache"/>
178,179d139
<                                         <cluster-passivation-store name="infinispan"
<                                                   cache-container="ejb" />
192d151
<                               <iiop enable-by-default="false" use-qualified-name="false" />
195,229c154,156
<                               <cache-container name="cluster" aliases="ha-partition"
<                                         default-cache="default">
<                                         <transport lock-timeout="60000" />
<                                         <replicated-cache name="default" mode="SYNC"
<                                                   batching="true">
<                                                   <locking isolation="REPEATABLE_READ" />
<                                         </replicated-cache>
<                               </cache-container>
<                               <cache-container name="web" aliases="standard-session-cache"
<                                         default-cache="repl">
<                                         <transport lock-timeout="60000" />
<                                         <replicated-cache name="repl" mode="ASYNC"
<                                                   batching="true">
<                                                   <file-store />
<                                         </replicated-cache>
<                                         <replicated-cache name="sso" mode="SYNC" batching="true" />
<                                         <distributed-cache name="dist" mode="ASYNC"
<                                                   batching="true" l1-lifespan="0">
<                                                   <file-store />
<                                         </distributed-cache>
<                               </cache-container>
<                               <cache-container name="ejb" aliases="sfsb sfsb-cache"
<                                         default-cache="repl">
<                                         <transport lock-timeout="60000" />
<                                         <replicated-cache name="repl" mode="ASYNC"
<                                                   batching="true">
<                                                   <eviction strategy="LRU" max-entries="10000" />
<                                                   <file-store />
<                                         </replicated-cache>
<                                         <!-- ~ Clustered cache used internally by EJB subsytem for managing the 
<                                                   client-mapping(s) of ~ the socketbinding referenced by the EJB remoting connector -->
<                                         <replicated-cache name="remote-connector-client-mappings"
<                                                   mode="SYNC" batching="true" />
<                                         <distributed-cache name="dist" mode="ASYNC"
<                                                   batching="true" l1-lifespan="0">
---
>             <cache-container name="hibernate" default-cache="local-query" module="org.jboss.as.jpa.hibernate:4">
>                 <local-cache name="entity">
>                     <transaction mode="NON_XA"/>
231,236c158,159
<                                                   <file-store />
<                                         </distributed-cache>
<                               </cache-container>
<                               <cache-container name="hibernate" default-cache="local-query"
<                                         module="org.jboss.as.jpa.hibernate:4">
<                                         <transport lock-timeout="60000" />
---
>                     <expiration max-idle="100000"/>
>                 </local-cache>
242,247c165
<                                         <invalidation-cache name="entity" mode="SYNC">
<                                                   <transaction mode="NON_XA" />
<                                                   <eviction strategy="LRU" max-entries="10000" />
<                                                   <expiration max-idle="100000" />
<                                         </invalidation-cache>
<                                         <replicated-cache name="timestamps" mode="ASYNC">
---
>                 <local-cache name="timestamps">
250c168
<                                         </replicated-cache>
---
>                 </local-cache>
253,260d170
<                     <subsystem xmlns="urn:jboss:domain:jacorb:1.2">
<                               <orb>
<                                         <initializers transactions="spec" security="on" />
<                               </orb>
<                     </subsystem>
<                     <subsystem xmlns="urn:jboss:domain:jaxr:1.1">
<                               <connection-factory jndi-name="java:jboss/jaxr/ConnectionFactory" />
<                     </subsystem>
282,316d191
<                     <subsystem xmlns="urn:jboss:domain:jgroups:1.1"
<                               default-stack="tcp">
<                               <stack name="tcp">
<                                         <transport type="TCP" socket-binding="jgroups-tcp">
<                                                   <property name="external_addr">${env.OPENSHIFT_GEAR_DNS}</property>
<                                                   <property name="external_port">${env.OPENSHIFT_JBOSS_CLUSTER_PROXY_PORT}
<                                                   </property>
<                                                   <property name="bind_port">7600</property>
<                                                   <property name="bind_addr">${env.OPENSHIFT_INTERNAL_IP}</property>
<                                         </transport>
<                                         <protocol type="TCPPING">
<                                                   <property name="timeout">3000</property>
<                                                   <property name="initial_hosts">${env.OPENSHIFT_JBOSS_CLUSTER}</property>
<                                                   <property name="port_range">0</property>
<                                                   <property name="num_initial_members">1</property>
<                                         </protocol>
<                                         <protocol type="MERGE2" />
<                                         <protocol type="FD" />
<                                         <protocol type="VERIFY_SUSPECT" />
<                                         <protocol type="BARRIER" />
<                                         <protocol type="pbcast.NAKACK" />
<                                         <protocol type="UNICAST2" />
<                                         <protocol type="pbcast.STABLE" />
<                                         <protocol type="pbcast.GMS" />
<                                         <protocol type="UFC" />
<                                         <protocol type="MFC" />
<                                         <protocol type="FRAG2" />
<                                         <protocol type="AUTH">
<                                                   <property name="auth_class">org.jgroups.auth.MD5Token</property>
<                                                   <property name="token_hash">SHA</property>
<                                                   <property name="auth_value">${env.OPENSHIFT_JBOSS_CLUSTER}</property>
<                                         </protocol>
<                                         <!--protocol type="pbcast.STATE_TRANSFER"/> <protocol type="pbcast.FLUSH"/ -->
<                               </stack>
<                     </subsystem>
324d198
<                     <subsystem xmlns="urn:jboss:domain:jsr77:1.0" />
330,403d203
<                     <subsystem xmlns="urn:jboss:domain:messaging:1.2">
<                               <hornetq-server>
<                                         <clustered>true</clustered>
<                                         <persistence-enabled>true</persistence-enabled>
<                                         <security-enabled>false</security-enabled>
<                                         <journal-file-size>102400</journal-file-size>
<                                         <journal-min-files>2</journal-min-files>
<                                         <connectors>
<                                                   <netty-connector name="netty" socket-binding="messaging" />
<                                                   <netty-connector name="netty-throughput"
<                                                             socket-binding="messaging-throughput">
<                                                             <param key="batch-delay" value="50" />
<                                                   </netty-connector>
<                                                   <in-vm-connector name="in-vm" server-id="0" />
<                                         </connectors>
<                                         <acceptors>
<                                                   <netty-acceptor name="netty" socket-binding="messaging" />
<                                                   <netty-acceptor name="netty-throughput"
<                                                             socket-binding="messaging-throughput">
<                                                             <param key="batch-delay" value="50" />
<                                                             <param key="direct-deliver" value="false" />
<                                                   </netty-acceptor>
<                                                   <in-vm-acceptor name="in-vm" server-id="0" />
<                                         </acceptors>
<                                         <!--broadcast-groups> <broadcast-group name="bg-group1"> <socket-binding>messaging-group</socket-binding> 
<                                                   <broadcast-period>5000</broadcast-period> <connector-ref>netty</connector-ref> 
<                                                   </broadcast-group> </broadcast-groups> <discovery-groups> <discovery-group 
<                                                   name="dg-group1"> <socket-binding>messaging-group</socket-binding> <refresh-timeout>10000</refresh-timeout> 
<                                                   </discovery-group> </discovery-groups> <cluster-connections> <cluster-connection 
<                                                   name="my-cluster"> <address>jms</address> <connector-ref>netty</connector-ref> 
<                                                   <discovery-group-ref discovery-group-name="dg-group1"/> </cluster-connection> 
<                                                   </cluster-connections -->
<                                         <address-settings>
<                                                   <!--default for catch all -->
<                                                   <address-setting match="#">
<                                                             <dead-letter-address>jms.queue.DLQ</dead-letter-address>
<                                                             <expiry-address>jms.queue.ExpiryQueue</expiry-address>
<                                                             <redelivery-delay>0</redelivery-delay>
<                                                             <redistribution-delay>1000</redistribution-delay>
<                                                             <max-size-bytes>10485760</max-size-bytes>
<                                                             <address-full-policy>BLOCK</address-full-policy>
<                                                             <message-counter-history-day-limit>10
<                                                             </message-counter-history-day-limit>
<                                                   </address-setting>
<                                         </address-settings>
<                                         <jms-connection-factories>
<                                                   <connection-factory name="InVmConnectionFactory">
<                                                             <connectors>
<                                                                       <connector-ref connector-name="in-vm" />
<                                                             </connectors>
<                                                             <entries>
<                                                                       <entry name="java:/ConnectionFactory" />
<                                                             </entries>
<                                                   </connection-factory>
<                                                   <connection-factory name="RemoteConnectionFactory">
<                                                             <connectors>
<                                                                       <connector-ref connector-name="netty" />
<                                                             </connectors>
<                                                             <entries>
<                                                                       <entry name="java:jboss/exported/jms/RemoteConnectionFactory" />
<                                                             </entries>
<                                                   </connection-factory>
<                                                   <pooled-connection-factory name="hornetq-ra">
<                                                             <transaction mode="xa" />
<                                                             <connectors>
<                                                                       <connector-ref connector-name="in-vm" />
<                                                             </connectors>
<                                                             <entries>
<                                                                       <entry name="java:/JmsXA" />
<                                                             </entries>
<                                                   </pooled-connection-factory>
<                                         </jms-connection-factories>
<                               </hornetq-server>
<                     </subsystem>
419,422c219,220
<                                         <capability name="org.apache.felix.configadmin"
<                                                   startlevel="1" />
<                                         <capability name="org.jboss.as.osgi.configadmin"
<                                                   startlevel="1" />
---
>                 <capability name="org.apache.felix.configadmin" startlevel="1"/>
>                 <capability name="org.jboss.as.osgi.configadmin" startlevel="1"/>
427c225
<                               <connector name="remoting-connector" socket-binding="remoting" />
---
>             <connector name="remoting-connector" socket-binding="remoting" security-realm="ApplicationRealm"/>
462,463c260
<                               <recovery-environment socket-binding="txn-recovery-environment"
<                                         status-socket-binding="txn-status-manager" />
---
>             <recovery-environment socket-binding="txn-recovery-environment" status-socket-binding="txn-status-manager"/>
466,471c263,265
<                     <subsystem xmlns="urn:jboss:domain:web:1.1"
<                               default-virtual-server="default-host" native="false">
<                               <connector name="http" protocol="HTTP/1.1" scheme="http"
<                                         socket-binding="http" />
<                               <virtual-server name="default-host"
<                                         enable-welcome-root="false">
---
>         <subsystem xmlns="urn:jboss:domain:web:1.1" default-virtual-server="default-host" native="false">
>             <connector name="http" protocol="HTTP/1.1" scheme="http" socket-binding="http"/>
>             <virtual-server name="default-host" enable-welcome-root="true">
472a267
>                 <alias name="example.com"/>
477c272
<                               <wsdl-host>${env.OPENSHIFT_INTERNAL_IP}</wsdl-host>
---
>             <wsdl-host>${jboss.bind.address:127.0.0.1}</wsdl-host>
480,483c275,276
<                                         <pre-handler-chain name="recording-handlers"
<                                                   protocol-bindings="##SOAP11_HTTP ##SOAP11_HTTP_MTOM ##SOAP12_HTTP ##SOAP12_HTTP_MTOM">
<                                                   <handler name="RecordingHandler"
<                                                             class="org.jboss.ws.common.invocation.RecordingServerHandler" />
---
>                 <pre-handler-chain name="recording-handlers" protocol-bindings="##SOAP11_HTTP ##SOAP11_HTTP_MTOM ##SOAP12_HTTP ##SOAP12_HTTP_MTOM">
>                     <handler name="RecordingHandler" class="org.jboss.ws.common.invocation.RecordingServerHandler"/>
489d281
< 
492c284
<                               <loopback-address value="${env.OPENSHIFT_INTERNAL_IP}" />
---
>             <inet-address value="${jboss.bind.address.management:127.0.0.1}"/>
495c287
<                               <loopback-address value="${env.OPENSHIFT_INTERNAL_IP}" />
---
>             <inet-address value="${jboss.bind.address:127.0.0.1}"/>
496a289
>         <!-- TODO - only show this if the jacorb subsystem is added  -->
498,500c291,295
<                               <!-- Used for IIOP sockets in the standarad configuration. To secure JacORB 
<                                         you need to setup SSL -->
<                               <loopback-address value="${env.OPENSHIFT_INTERNAL_IP}" />
---
>             <!--
>               ~  Used for IIOP sockets in the standard configuration.
>               ~                  To secure JacORB you need to setup SSL 
>               -->
>             <inet-address value="${jboss.bind.address.unsecure:127.0.0.1}"/>
503,510c298,302
< 
<           <socket-binding-group name="standard-sockets"
<                     default-interface="public" port-offset="0">
<                     <socket-binding name="management-native" interface="management"
<                               port="9999" />
<                     <socket-binding name="management-http" interface="management"
<                               port="9990" />
< 
---
>     <socket-binding-group name="standard-sockets" default-interface="public" port-offset="${jboss.socket.binding.port-offset:0}">
>         <socket-binding name="management-native" interface="management" port="${jboss.management.native.port:9999}"/>
>         <socket-binding name="management-http" interface="management" port="${jboss.management.http.port:9990}"/>
>         <socket-binding name="management-https" interface="management" port="${jboss.management.https.port:9443}"/>
>         <socket-binding name="ajp" port="8009"/>
513,523c305
<                     <socket-binding name="jacorb" interface="unsecure"
<                               port="3528" />
<                     <socket-binding name="jacorb-ssl" interface="unsecure"
<                               port="3529" />
<                     <socket-binding name="jgroups-tcp" port="7600" />
<                     <socket-binding name="messaging" port="5445" />
<                     <!--socket-binding name="messaging-group" multicast-address="${jboss.messaging.group.address:231.7.7.7}" 
<                               multicast-port="${jboss.messaging.group.port:9876}"/ -->
<                     <socket-binding name="messaging-throughput" port="5455" />
<                     <socket-binding name="osgi-http" interface="management"
<                               port="8090" />
---
>         <socket-binding name="osgi-http" interface="management" port="8090"/>
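To put the diff in context, the datasources on the OpenShift side are consumed by applications through the JNDI names they declare, such as java:jboss/datasources/MysqlDS. Below is a minimal sketch of a component that looks one up; the HealthCheck class, its method, and the two-second validity timeout are hypothetical, and it assumes the class is deployed as a CDI or EJB managed bean so that @Resource injection applies. Only the JNDI binding itself comes from the configuration shown in the diff.

    import javax.annotation.Resource;
    import javax.sql.DataSource;
    import java.sql.Connection;

    public class HealthCheck {
        // Injects the datasource bound by the OpenShift standalone.xml above
        @Resource(lookup = "java:jboss/datasources/MysqlDS")
        private DataSource ds;

        // Returns true if a connection can be obtained and validated
        public boolean databaseReachable() {
            try (Connection c = ds.getConnection()) {
                return c.isValid(2); // two-second validation timeout
            } catch (Exception e) {
                return false;
            }
        }
    }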
