19 Replies. Latest reply: Jan 7, 2011 5:43 PM by Andrew Rubinger

Arquillian Design Questions:

Jeremy Norris Newbie

I have a few comments/questions about Arquillian after some initial research.

 

What are the design goals for Arquillian? Is it designed to a) bootstrap a container in process, or b) connect to an already running standalone container and programmatically drive it? Or perhaps both? (I noticed the initial TestNG examples seem to only showcase b), or perhaps I'm missing something.)

 

In my opinion, from a testing perspective, a) is the most desirable since this allows the tests to be very portable and self-contained.  It also makes debugging easier.  I'm sure this is in the works as this is in essence the purpose of the EmbeddedAS project.  b) is useful too as a deployment/release management tool, but not as much for automated testing IMO.

 

Arquillian seems to abstract the container lifecycle and deployment (like a modern-day Cargo).  However, how is this configured?  In other words, how do you select the AS implementation you wish to use from the TestNG level?

 

Regarding ShrinkWrap, this is a very good idea - enabling access to existing AS standard deployment abstractions (eg: jar/war/ear) that previously required assembling the physical archives.  This enables use from places that were awkward before (eg: straight from the IDE).  However, is there an implementation or adaptor that uses Maven's project metadata to define the ShrinkWrap archive?  In many cases, I think developers have a war or jar defined with Maven that they want to deploy/test straight from their IDE (eg: think embedded-glassfish:run) without duplicating all their project structure in a ShrinkWrap builder.

 

This project is looking great.  Thanks to everyone involved.

 

-Jeremy Norris

  • 1. Re: Arquillian Design Questions:
    Dan Allen Master

    Jeremy, I'm glad you've raised these questions because they get right to the core objectives of this project, which we are currently working to refine. We invite you to join us in aligning the solutions Arquillian provides with the testing needs of project development teams.

     

    With that said, let's get to the responses.

     

    What are the design goals for Arquillian?

     

    Simply put, Arquillian strives to make true integration testing no more difficult than basic unit testing. You aren't really testing your component or subsystem until you test it in situ. I call this true integration testing because it's all too easy to create a test that puts on a good show but doesn't provide any real guarantee that the code under test functions properly in a production environment. The show typically involves mock components and/or configurations that cater to the test. Such "unit tests" can't verify that the declarative services kick in as they should. While unit tests certainly have value in quickly testing algorithms and business calculations within methods, there still need to be tests that exercise the component as a complete service. So Arquillian is about putting your component to work and asserting the correctness of that work.

     

    Is it designed to a) bootstrap a container in process, or b) connect to an already running standalone container and programmatically drive it?  Or perhaps both?

     

    Both. In fact, there are three types of environments that Arquillian can use.

     

    1. A standalone container
    2. An embedded container
    3. A bootstrapped framework, such as Weld SE (CDI)

     

    Arquillian provides SPIs that handle each of the tasks involved in controlling the runtime environment, executing the tests and aggregating the results. So in theory, you can support just about any environment that can be controlled with the set of hooks you are given. The implementations provided so far are focused on Java EE containers (JBoss AS, GlassFish), servlet containers (Tomcat, Jetty) and a pure CDI environment (Weld SE).
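    As a rough illustration, such a container SPI might look like the following. This is a hypothetical sketch; the actual interface names, signatures, and exception types in the Arquillian codebase may differ:

```java
// Hypothetical shape of a container SPI: one implementation per
// environment (remote JBoss AS, embedded GlassFish, Weld SE, ...).
// Arquillian drives the lifecycle through these hooks, so supporting
// a new environment means implementing this contract.
public interface DeployableContainer {
    void start() throws LifecycleException;      // boot, or connect to a running instance
    void deploy(Archive<?> archive) throws DeploymentException;
    void undeploy(Archive<?> archive) throws DeploymentException;
    void stop() throws LifecycleException;       // shut down, or disconnect
}
```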

     

    In my opinion, from a testing perspective, a) is the most desirable since this allows the tests to be very portable and self-contained.  It also makes debugging easier.  I'm sure this is in the works as this is in essence the purpose of the EmbeddedAS project.  b) is useful too as a deployment/release management tool, but not as much for automated testing IMO.

     

    Absolutely. The biggest hurdle here is the availability of an embedded Java EE container that has all the features of a true Java EE container (meaning, a container which is compliant). Embedded GlassFish is pretty close, and JBoss Embedded AS is still being shaped.

     

    We must acknowledge that the biggest driving force behind the preference for Jetty during development is the Maven jetty:run command. People love being able to deploy their application in place, without any prerequisite install step. We need to match that experience, especially in a testing environment, and we should strive to unseat Jetty by providing a more compelling option in the form of a fast, embedded Java EE container. But it's important to recognize that being able to test in a real Java EE container is also very valuable, especially to the QA team.

     

    Arquillian seems to abstract the container lifecycle and deployment (like a modern day Cargo).  However, how is this configured?  In other words how do you select the AS implementation you wish to use from the TestNG level?

     

    Indeed, Arquillian has a much better-designed architecture than what Cargo provides. And, as a result, we should see the number of implemented environments grow rapidly.

     

    Currently, you configure your target environment using profiles (e.g., Maven profiles) to swap which implementation is on the classpath when the test runner (i.e., TestNG or JUnit) is executed.
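    Concretely, a pom.xml can declare one profile per target container, each contributing a different container implementation to the test classpath. The artifact coordinates below are illustrative, not the exact published IDs:

```xml
<profiles>
  <profile>
    <id>jbossas-remote</id>
    <dependencies>
      <dependency>
        <groupId>org.jboss.arquillian.container</groupId>
        <artifactId>arquillian-jbossas-remote</artifactId>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>glassfish-embedded</id>
    <dependencies>
      <dependency>
        <groupId>org.jboss.arquillian.container</groupId>
        <artifactId>arquillian-glassfish-embedded</artifactId>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```

    Running `mvn test -Pglassfish-embedded` would then execute the same tests against the embedded GlassFish adapter, with no change to the test code.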

    Regarding ShrinkWrap, this is a very good idea - enabling access to existing AS standard deployment abstractions (eg: jar/war/ear) that previously required assembling the physical archives.  This enables use from places that were awkward before (eg. straight from the IDE).

     

    Absolutely.

     

    However, is there an implementation or adaptor that uses Maven's project metadata to define the ShrinkWrap archive?  In many cases, I think developers have a war or jar defined with Maven that they want to deploy/test straight from their IDE (eg: think embedded-glassfish:run) without duplicating all their project structure in a ShrinkWrap builder.

     

    That's an interesting idea and something that should definitely be considered. But let me clarify something about ShrinkWrap's role in Arquillian to hopefully give you a better idea of what we are trying to achieve.

     

    One huge advantage ShrinkWrap brings to Arquillian is classpath control. The classpath of a test run has traditionally been a kitchen sink of all production classes and resources with the test classes and resources layered on top. ShrinkWrap enables you to create micro deployments that focus on the interaction between a logical grouping of classes. Within that grouping you get the self-assembly of services provided by Java EE: the very integration being tested.
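    For example, a micro-deployment for testing a single CDI service might look like this. Greeter and GreeterService are hypothetical classes, and the builder style follows the Archives API used elsewhere in this thread; exact method names may vary by ShrinkWrap release:

```java
// Only the logical grouping under test goes into the archive --
// plus an empty beans.xml so the container's CDI self-assembly kicks in.
JavaArchive archive = Archives.create("greeter.jar", JavaArchive.class)
    .addClasses(Greeter.class, GreeterService.class)
    .addManifestResource(EmptyAsset.INSTANCE, "beans.xml");
```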

     

    Hope that gives you a good idea of where we are going with Arquillian. Clearly what is missing at this point is a user guide. That's a job that currently rests on my plate.

  • 2. Re: Arquillian Design Questions:
    Andrew Rubinger Master
    Regarding ShrinkWrap, this is a very good idea - enabling access to existing AS standard deployment abstractions (eg: jar/war/ear) that previously required assembling the physical archives.  This enables use from places that were awkward before (eg: straight from the IDE).  However, is there an implementation or adaptor that uses Maven's project metadata to define the ShrinkWrap archive?  In many cases, I think developers have a war or jar defined with Maven that they want to deploy/test straight from their IDE (eg: think embedded-glassfish:run) without duplicating all their project structure in a ShrinkWrap builder.

    I was thinking about this just the other day.  Allow me to stream-of-consciousness some ideas:

     

    We need an adaptor of some sort, otherwise we'll have the situation where folks are integration-testing artifacts that they manually assemble in ShrinkWrap, while the archive that's truly deployed is coming from the Maven build cycle.  I don't think it'll be too difficult to create some wrapper like:

     

    JavaArchive archive = Maven2Archives.create("groupId","artifactId","classifier","type",JavaArchive.class);

     

    Under the hood we can get the original archive (either from the user's local repository or the target directory?) and use the ZipImporter to make a ShrinkWrap archive out of it.
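    For the local-repository half of that lookup, the file location is fully determined by the coordinates, so the resolution step can be sketched with nothing but path arithmetic. Maven2Archives and localRepoPath below are hypothetical names, not an existing API; the resulting file would then be handed to the ZipImporter:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class Maven2Archives {

    // Resolve coordinates to the artifact file under a local repository,
    // following the standard Maven 2 layout:
    //   <repo>/<groupId with '.' -> '/'>/<artifactId>/<version>/
    //       <artifactId>-<version>[-<classifier>].<type>
    static Path localRepoPath(Path repo, String groupId, String artifactId,
                              String version, String classifier, String type) {
        String file = artifactId + "-" + version
                + (classifier == null || classifier.isEmpty() ? "" : "-" + classifier)
                + "." + type;
        return repo.resolve(groupId.replace('.', '/'))
                   .resolve(artifactId)
                   .resolve(version)
                   .resolve(file);
    }

    public static void main(String[] args) {
        Path repo = Paths.get(System.getProperty("user.home"), ".m2", "repository");
        // Once located, the file could be read back through ShrinkWrap's
        // ZipImporter to turn the built artifact into a ShrinkWrap archive.
        System.out.println(localRepoPath(repo, "org.jboss.shrinkwrap",
                "shrinkwrap-api", "1.0.0-alpha-1", null, "jar"));
    }
}
```

    The target-directory case is harder, since it requires knowing the project basedir, which is exactly the second question below.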

     

    This brings up some other questions:

     

    1. The test will have to take place in the proper Maven lifecycle phase (ie. "integration-test", after packaging into project-dir/target, or even in "deploy", after the artifact has been installed into the user's local repository?  This gets iffy).
    2. How does the runtime code know where the project root is?  Require the user to specify it, relative to TestClass.class.getProtectionDomain().getCodeSource().getLocation() ?  Ick.

     

    Maybe we could instead invert the responsibility.  Instead of ShrinkWrap creating from Maven metadata, some tool could specify something to the test runtime:

     

    @CurrentArtifact(classifier="jar")
    JavaArchive theThingImBuildingInThisProject;
    
    @Maven2Artifact("groupId:artifactId:version:classifier:type")
    JavaArchive somethingIDependOnFromLocalMaven2Repository;
    

     

    Here we don't define the deployment at all, but we expect something to come along and inject it.  Again we need to know some key information in order to locate the right JARs to inject.

     

    Other thoughts appreciated either here or in the ShrinkWrap forums.


    S,

    ALR

  • 3. Re: Arquillian Design Questions:
    Aslak Knutsen Master

    Since we're discussing Maven artifacts and ShrinkWrap...

     

    Even though it's nice if people only used what the container provides, they tend to have other dependencies, and often these are controlled by Maven.

     

    We should support a way of adding Maven-based dependencies to a war/ear.

     

    archive.addLibrary(Maven.library("groupid", "artifactid", "version"))
    

     

    If we extended the import api to include Asset, we could have:

    Archives.create("my.jar", Importer.class).importFrom(Maven.library("groupid", "artifactid", "version")).addResource("")
    

     

     

  • 4. Re: Arquillian Design Questions:
    Andrew Rubinger Master

    Good idea, though this one's more difficult.  We'd have to hook into the Maven dependency resolver somehow.

     

    As a first-pass workaround, users can assemble the dependencies using the Assembly plugin, then import the artifact with that classifier.

     

    Anyway, I've created a space for us to brain dump these ideas for further refinement later:

     

    http://community.jboss.org/docs/DOC-14722

     

    S,

    ALR

  • 5. Re: Arquillian Design Questions:
    Dan Allen Master
    This brings up some other questions:

     

    1. The test will have to take place in the proper Maven lifecycle phase (ie. "integration-test", after packaging into project-dir/target, or even in "deploy", after the artifact has been installed into the user's local repository?  This gets iffy).
    2. How does the runtime code know where the project root is?  Require the user to specify it, relative to TestClass.class.getProtectionDomain().getCodeSource().getLocation() ?  Ick.

     

    Typically, if you want to work with the artifact, you should be using the integration-test phase.

     

    You could assume that the root of the project directory is two levels up, since you know the test classes will be compiled into target/test-classes by default. Then again, doing things like that is always fragile. Another thought is whether you can get Surefire to pass in the Maven basedir (or the absolute path to the generated artifact: jar, war, or ear).
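    The "two levels up" convention is trivial to code, which is part of why it is tempting despite being fragile. Class and method names here are illustrative:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class ProjectRoot {
    // By default Maven compiles test classes to <basedir>/target/test-classes,
    // so the project root is two directory levels above that. As noted, this
    // breaks as soon as a build customizes the output directory.
    static Path basedirFrom(Path testClassesDir) {
        return testClassesDir.getParent().getParent();
    }

    public static void main(String[] args) {
        // In a real test, the starting point would come from the class itself:
        //   Paths.get(MyTest.class.getProtectionDomain()
        //       .getCodeSource().getLocation().toURI())
        System.out.println(basedirFrom(Paths.get("/work/myapp/target/test-classes")));
        // prints /work/myapp
    }
}
```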

     

    Maybe we could instead invert the responsibility.  Instead of ShrinkWrap creating from Maven metadata, some tool could specify something to the test runtime:

     

    That seems reasonable. I'm thinking this is a fairly general problem in Maven: the test code needs to know about the artifact that the project is building. We could talk to the Maven folks to see if they have ever run into this need.

     

    I do like the idea of being able to reference JARs in the local Maven repository. Also, it would be nice if it picked up the version from the Maven dependency resolver.

     

    I'm not sure we would ever want the whole set of dependencies, because the very problem we are trying to avoid is having the kitchen sink stuffed into our test archives.

  • 6. Re: Arquillian Design Questions:
    Dan Allen Master
    groupId and artifactId are cool. version is something we want to avoid hard-coding, if possible; read it from the Maven pom (or dependency resolver) if at all possible.
  • 7. Re: Arquillian Design Questions:
    Aslak Knutsen Master

    Unless you want to do version tests:

     

    Does my app support all n versions of lib X?

     

    for (version : versions) {
        archive.addLibrary(Maven.artifact("group", "mydepth", version));
        server.deploy(archive);
        // ... run test
    }

     

  • 8. Re: Arquillian Design Questions:
    Andrew Rubinger Master

    Another way to do this:

     

    Use the Maven dependency plugin to extract the artifact/version you want into project/target/deps, then import the JAR from there.

     

    S,

    ALR

  • 9. Re: Arquillian Design Questions:
    Dan Allen Master
    Ah. Both good points. When we get a prototype working it will be interesting to hear from folks how they will most likely use it.
  • 10. Re: Arquillian Design Questions:
    Jeremy Norris Newbie

    dan.j.allen wrote:

     

    That's an interesting idea and something that should definitely be considered. But let me clarify something about ShrinkWrap's role in Arquillian to hopefully give you a better idea of what we are trying to achieve.

     

    One huge advantage ShrinkWrap brings to Arquillian is classpath control. The classpath of a test run has traditionally been a kitchen sink of all production classes and resources with the test classes and resources layered on top. ShrinkWrap enables you to create micro deployments that can focus on the interaction between a logical grouping of classes. Within that grouping you get the self-assembly of services provided by Java EE; the integration which is being tested.

     

     

    I apologize for the delay.

     

    I agree with your statements above, although I was talking about something a little bit different.  I think everyone is in agreement that ShrinkWrap is in a very good position to help with a) classpath control (ie: "micro-deployments") and b) giving you the packaging abstraction before things have been officially built so they can be handed off to containers.

     

    When I said before that developers usually have a war or jar defined with Maven that they want to deploy/test straight from their IDE without duplicating all their project structure in a ShrinkWrap builder, I was referring to their current project.  Being able to reference other Maven artifacts and dependencies in my test as another way to define these micro-deployments is an interesting idea (and probably a useful one), although that wasn't my main concern.

     

    Rather, I was saying that even though the ability to create micro-deployments (ie: fine-grained classpath control) within your test is very important, in many tests I just want to use the classpath that is already defined (eg: the test-classpath in maven) and then use other mechanisms to select the active components (eg: @Alternative, a mocking framework, or any mechanism that already exists for doing this) at runtime within the container.  I just wanted to make sure that in this case, it was easy to use ShrinkWrap to capture the actual classpath as defined by Maven without writing a ShrinkWrap definition that essentially repeats what is already in Maven.

     

    This conversation is very interesting since it gets to an issue much bigger than the simple ShrinkWrap default builder I'm talking about:

     

    Arquillian, as defined, becomes a competing testing system (ie: a framework, a philosophy and set of best-practices) to all the other ways that are available at a lower level, inside the packaging level (eg: @Alternative, mocking framework configuration, etc.).  Arquillian says that the "better way" is to leave all that stuff out of the deployment units and rather do a more surgical, targeted approach with micro-deployments (ie: classpath control).  This is a very good idea, and may indeed be a better approach.  However, should we still make it possible/easy for people to use their specific lower level testing systems (and take advantage of ShrinkWrap's packaging abstraction)?

     

    Is the above a fair assessment of Arquillian's philosophy?

  • 11. Re: Arquillian Design Questions:
    Dan Allen Master

    jnorris10 wrote:

     

    Arquillian, as defined, becomes a competing testing system (ie: a framework, a philosophy and set of best-practices) to all the other ways that are available at a lower level, inside the packaging level (eg: @Alternative, mocking framework configuration, etc.).  Arquillian says that the "better way" is to leave all that stuff out of the deployment units and rather do a more surgical, targeted approach with micro-deployments (ie: classpath control).  This is a very good idea, and may indeed be a better approach.  However, should we still make it possible/easy for people to use their specific lower level testing systems (and take advantage of ShrinkWrap's packaging abstraction)?

     

    Is the above a fair assessment of Arquillian's philosophy?

     

    I absolutely agree with you that we have several distinct testing needs that Arquillian can seek to fill.

     

    1. Run Arquillian using the project's full test classpath (we can think of it as the Maven test classpath--or the kitchen sink). This comes down to effectively transporting the test classpath into the container and executing the selected tests remotely.
    2. Run an Arquillian test against the project artifact. This would mean publishing the artifact from the main classpath as a library on which the test depends, but getting to select which test classes get included (your @Alternative beans and such).
    3. Take the surgical approach of constructing a custom archive to execute an in-container test with the granularity of your choosing (you select only the classes you want to participate in the test).

     

    I think that without a change to your testing strategy, #2 (the middle road) would work best for you.

     

    The problem with the Maven test classpath is that when the tests are run, Maven dumps the whole test classpath into the kitchen sink. It's fine to have all of the main classpath in there, because those classes have to work together. But you don't assume that all of your test classes are going to play together at once. Typically, when you are working on a test, you only want to introduce certain additional classes because those are the overrides and helpers that serve the test scenario.

  • 12. Re: Arquillian Design Questions:
    Dan Allen Master

    Also, #2 is considerably more valuable than #1 when you are dealing with CDI, since each test case is potentially going to need to activate different @Alternative beans (or overrides of any other sort), which means for each test class you need:

     

    1. a separate bootstrap of CDI (which Arquillian gives you for free) and
    2. a different beans.xml (which you are going to need to do through ShrinkWrap packaging).

     

    You could get away with having the whole test classpath + a beans.xml per test class. But if you are going to do that, you might as well just keep going and only include the test classes you need/want. Regardless of what you choose, we'll make sure Arquillian is flexible enough to go to either extreme.

  • 13. Re: Arquillian Design Questions:
    Jeremy Norris Newbie

    This "full classpath" vs. "micro-deployment" is a very interesting discussion.

     

    The "best" or most productive approach may vary according to the particular testing level you're working at.  By that I mean the following: in my opinion, there are multiple levels of testing, from a pure unit test all the way up to a pure integration test, and the levels in the middle are very important.  For example, testing an EJB component using the real application server code with associated declarative services but with mocked resources.  This is obviously not a pure unit test, and it's obviously not a pure integration test.  Some component aggregation is being tested in situ, but this component will be aggregated further in a real production environment.

     

    Tests in these intermediate levels are very productive IMO because the container's declarative functionality can be tested and automated in a way that would be impractical for a pure integration test.  Connecting to an external container and transporting over the artifacts is a great feature for integration testing at a higher level (and perhaps deployment automation scripts, etc.), but it's not as valuable in the testing levels I'm talking about.  These levels can also be tested directly from the IDE (and very quickly, before the artifact is fully constructed, thanks to ShrinkWrap).  This also aids tremendously in debugging, since declarative services in the container can be stepped through immediately.

     

    Sorry that the above was a bit long; however, I want to give my next comments a frame of reference.  I am talking about the "full classpath" vs. "micro-deployment" question in the context of these intermediate levels (ie: inside the IDE, full embedded container with declarative services, but only subsets of components, mocks may be used on the edges, @Alternative, etc.).

    1. Run Arquillian using the project's full test classpath (we can think of it as the Maven test classpath--or the kitchen sink). This comes down to effectively transporting the test classpath into the container and executing the selected tests remotely.

    Specifically in the case above, transporting into an embedded container.  This is the "full classpath" solution.

    2. Run an Arquillian test against the project artifact. This would mean publishing the artifact from the main classpath as a library on which the test depends, but getting to select which test classes get included (your @Alternative beans and such).

    This option really hurts the testing round-trip time as well as IDE integration.  You can't rely on the IDE's continuous compile feature and just run your test; rather, you have to ensure the artifact is constructed and installed somewhere first.  Also, this precludes the use of the debugger's hotswap features, which are very useful and save a lot of time.  (I realize there are big caveats here; you can't hotswap a new declarative component that needs to have a lifecycle initialized, etc., but still this feature is often very valuable.)

    3. Take the surgical approach of constructing a custom archive to execute an in-container test with the granularity of your choosing (you select only the classes you want to participate in the test).

    I love this feature.  However, in the context of the testing level I'm talking about, it sounds very tedious to specify the classes required for each deployment, especially when there are lots of dependent classes of the component under test.  In practice, I haven't found it that annoying to test using the entire test classpath, because the test usually also explicitly specifies other implementations or mocks to use on the edges of the test.  Yes, it is annoying at times, but that annoyance has to be matched against the annoyance of specifying all the components and class dependencies of a test (and carrying that forward when refactoring).

     

    I understand you want to support all these cases well and that's a good idea.  However, as an aside, I'm really interested in learning what the best and most productive practices end up being - and these best practices may end up being different depending on the testing level.

  • 14. Re: Arquillian Design Questions:
    Jeremy Norris Newbie
    2. Run an Arquillian test against the project artifact. This would mean publishing the artifact from the main classpath as a library on which the test depends, but getting to select which test classes get included (your @Alternative beans and such).

    This option really hurts the testing round-trip time as well as IDE integration.  You can't rely on the IDE's continuous compile feature and just run your test; rather, you have to ensure the artifact is constructed and installed somewhere first.  Also, this precludes the use of the debugger's hotswap features, which are very useful and save a lot of time.  (I realize there are big caveats here; you can't hotswap a new declarative component that needs to have a lifecycle initialized, etc., but still this feature is often very valuable.)

     

    Hold on, option 2 is perfect and does not require the artifact to be physically built and installed somewhere first.  It simply requires ShrinkWrap to automatically include the entire main classpath.  The tester then only has to explicitly list the test classes that are active for that test.  This is a fantastic option for many use-cases.  (In fact, if you use option 1, you are already specifying the classes to be used in setup boilerplate somewhere in the test itself (depending on the framework being used)).

     

    (Sorry, Dan, this is perhaps what you were originally intending for option 2.  When you said "test against the project artifact" and "publishing the artifact", I thought you were referring to the build system (eg: Maven) physically building the artifact first (a deal-breaker for IDE productivity).  However, you probably were referring to ShrinkWrap.)
