sca-assembly message



Subject: Improving conformance testing modularity


Hi Mike et al,

From last week's email thread, I promised to take a look at the
conformance tests and provide feedback as to how they can potentially  
be refactored to accommodate specification "modularity," i.e. the  
ability for vendors to configure the test harness to be run against  
proprietary implementation languages. Here's a start of that feedback.  
I'm happy to supply code-level examples based on responses to some of  
my questions and proposals below.

I'll start with an initial set of questions and comments and then move  
on to some proposals for how modularity might be structured.

Here are the initial questions:

1. How are test artifacts built? What tools are required (e.g. Ant,  
Maven)? From the file layout, it appears Maven is being used. I think  
the build artifacts should be checked into the svn repository.

2. Will contribution artifacts be bundled in zip archives only? I
assume this has to be the case, since SCA only requires support for
that format.

3. Why are there import.java statements in the ASM_ contributions? I
don't understand why they are needed even when Java is used as the
implementation language. In fact, I think they should be removed in
order to verify that import.java/export.java work correctly: a
contribution that uses a composite from another contribution should not
have to import the Java artifacts referenced by that composite, since
they are an implementation detail.

4. The purpose of General, Contribution1 and Contribution2 is not
clear to me. I think they are intended to be different contributions,
but I am not sure (it would be helpful to have this explained). If they
are intended to be separate contributions and Maven is the build
system, then my recommendation would be to create separate modules for
them, since zip files will need to be produced. Otherwise, Maven
won't behave nicely.

5. In the General contribution, the namespace
http://docs.oasis-open.org/ns/opencsa/scatests/200903 is both imported
and exported. This type of behavior is defined in OSGi, but is it
defined in Assembly? I don't remember seeing it, and if it is not
defined, it probably should be removed.

6. Java package names should follow the JLS naming recommendations
instead of, for example, "test".

7. Given the following interface:

public interface RuntimeBridge {

	/**
	 * Sets the Test configuration for a particular test run
	 * @param testConfiguration
	 */
	public void setTestConfiguration(TestConfiguration testConfiguration);

	/**
	 * Start the contribution(s) which represent the application under test in the SCA runtime
	 * @param contributionURIs - an array of contribution URIs as strings
	 * @return true if the contributions were loaded and started in the SCA runtime, false otherwise
	 */
	boolean startContribution(String[] contributionURIs) throws Exception;

	/**
	 * Stop the contribution(s)
	 */
	void stopContribution();

	String getContributionLocation(Class<?> testClass);

} // end interface RuntimeBridge


a. What does getContributionLocation(Class<?> testClass) do? I believe
it returns a URL for the contribution archive, assuming "testClass"
is in it. If this is the case, why are the test classes bundled in the
contribution archive? My assumption is the following: the contents of
a contribution archive should not be visible to the test client
classloader. The test harness should also ensure that the contents of
the contributions are visible to the classloader used to boot the SCA
runtime. Otherwise, the Java import/export mechanisms may not
function properly and cannot be verified. Regardless of this, and as a
general design point, the method seems generic and should not be part
of the runtime-specific SPI, as it is not determined by the runtime
implementation.
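
For example, a generic helper along these lines could live in the shared
test harness rather than in each runtime bridge (a rough sketch; the class
name and the assumption that each contribution archive sits on the test
classpath are mine):

import java.net.URL;

/**
 * Hypothetical harness-level utility: resolves the archive (jar/zip) a
 * class was loaded from, so individual RuntimeBridge implementations do
 * not have to provide getContributionLocation() themselves.
 */
public final class ContributionLocator {

	private ContributionLocator() {
	}

	public static URL locate(Class<?> testClass) {
		// Returns the code source of the class, i.e. the URL of the jar/zip
		// (or directory) it was loaded from. This assumes the archive is
		// visible to the harness classloader, which is exactly the point
		// under debate above.
		return testClass.getProtectionDomain().getCodeSource().getLocation();
	}
}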

b. What is the array contributionURIs in startContribution(String[]
contributionURIs)? Are they URLs, i.e. intended to be dereferenceable?
If so, I think the type should be URL[] or (better) List<URL>; see the
sketch after point (c) below.

c. How is a runtime notified of which deployable composites to deploy  
in a particular contribution? Related to that, I noticed only one  
composite is defined as a deployable (test:TEST_ASM_12001). That seems  
problematic.
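
To make (b) and (c) concrete, the start operation could carry typed
contribution locations plus the deployable composites explicitly (a rough
sketch of what I have in mind, names invented; the fuller SPI proposal
later in this mail supersedes it):

import java.net.URL;
import java.util.List;
import java.util.Set;
import javax.xml.namespace.QName;

/**
 * Sketch only: a revised start operation with typed contribution
 * locations (point b) and an explicit set of deployable composites
 * (point c), so the runtime does not have to guess which composite in
 * each contribution is under test.
 */
public interface RevisedStartOperation {

	boolean startContributions(List<URL> contributionUrls, Set<QName> deployables) throws Exception;

}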

8. Given the following class:

public class ASM_4001_TestCase extends BaseJAXWSTestCase {

	protected TestConfiguration getTestConfiguration() {
		TestConfiguration config = new TestConfiguration();
		config.testName          = this.getClass().getSimpleName().substring(0, 8);
		config.input             = "request";
		config.output[0]         = config.testName + " " + config.input + " service1 operation1 invoked";
		config.composite         = "Test_" + config.testName + ".composite";
		config.testServiceName   = "TestClient";
		config.testClass         = ASM_0002_Client.class;
		config.contributionNames = new String[] { "General", "General" + _Lang };
		config.serviceInterface  = TestInvocation.class;
		return config;
	}

}

a. What is "General"+_Lang  in "config.contributionNames	= new  
String[] { "General", "General" + _Lang };" used for? Can't the URLs  
of language specific  contributions be explicitly supplied by the  
vendor?
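
As one possible alternative (purely a sketch; the property name and helper
below are hypothetical), the vendor could point the harness at its
language-specific contribution directly, for example via a system property,
instead of the harness deriving the name from a _Lang suffix:

import java.net.MalformedURLException;
import java.net.URL;

/**
 * Hypothetical helper: lets a vendor supply the location of its
 * language-specific contribution explicitly, e.g.
 *   -Dsca.tests.lang.contribution=file:/path/to/vendor-contribution.zip
 * instead of the harness composing "General" + _Lang.
 */
public final class VendorContributions {

	private VendorContributions() {
	}

	public static URL languageContribution() throws MalformedURLException {
		String location = System.getProperty("sca.tests.lang.contribution");
		if (location == null) {
			throw new IllegalStateException("sca.tests.lang.contribution not set");
		}
		return new URL(location);
	}
}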

----------------------------------------------------------

Here is the start of a proposal for introducing more modularity:

One approach to modularizing the conformance tests would be to assume
the build system for assembly is Maven (an alternative could be chosen,
which would impose different constraints on how modularity is achieved,
and a choice would have to be made). Languages other than Java and BPEL
may choose to have their own build systems.

Assuming Maven, the conformance tests should form several multi-module
projects. Something like:

Assembly Project
	|
	|------------------- Assertions Module:
	|				- Forms a single SCA contribution
	|				- Imports a language-specific namespace
	|				- Contains composites against which test assertions are run. These composites define components whose implementations are composites in the language-specific namespace
	|------------------- Assembly Test Client Module
	|				- Contains test clients (JAX-WS) which perform and verify test assertions
	|				- Uses the RuntimeBridge SPI to deploy the assertions module contribution and language-specific contributions

Java Project
	|
	|------------------- Java Implementation Module
	|
	(other modules for testing things like import.java/export.java)

BPEL Project, etc.

One advantage of separating these projects is that they can be
released independently. For example, the conformance tests for language
X could be released after the Assembly project. I don't think this is
far off from the current structure of the SCA composites, although it
would involve restructuring the conformance test layout.

In terms of the SPI, I was thinking the RuntimeBridge SPI will need to
be refactored for a number of reasons. Probably the most important is
that there will need to be a mechanism for testing deployment semantics
(e.g. include-in-domain, remove, etc.). Another reason is that the
current SPI does not really fit some runtimes. For example, Fabric3 has
no notion of start/stop contribution. It does, however, provide
facilities for installing/uninstalling contributions in a domain and
deploying/undeploying composites. Further, Fabric3 conformance tests
will likely be run against our standard software distribution, which
is a separate server process that can be managed through JMX
(customers will want the standard distribution verified for
conformance).
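
For what it's worth, a bridge for that kind of runtime could be implemented
as a thin remote client (sketch only; the JMX service URL, MBean name and
operation names below are placeholders, not Fabric3's actual management
API):

import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;
import javax.xml.namespace.QName;

/**
 * Sketch of a RuntimeBridge-style adapter that drives a separate server
 * process over JMX instead of starting an embedded runtime.
 */
public class RemoteRuntimeBridge {

	private MBeanServerConnection connection;

	public void connect(String host, int port) throws Exception {
		JMXServiceURL url = new JMXServiceURL(
				"service:jmx:rmi:///jndi/rmi://" + host + ":" + port + "/server");
		JMXConnector connector = JMXConnectorFactory.connect(url);
		connection = connector.getMBeanServerConnection();
	}

	public void installContribution(String contributionUrl) throws Exception {
		// Placeholder MBean and operation names; a real bridge would use
		// the runtime's actual management interface.
		ObjectName domain = new ObjectName("runtime:type=Domain");
		connection.invoke(domain, "install",
				new Object[] { contributionUrl },
				new String[] { String.class.getName() });
	}

	public void deploy(QName deployable) throws Exception {
		ObjectName domain = new ObjectName("runtime:type=Domain");
		connection.invoke(domain, "deploy",
				new Object[] { deployable.toString() },
				new String[] { String.class.getName() });
	}
}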

Based on my previous comments, I was thinking of an SPI similar to the  
following:

public interface RuntimeBridge {

	void setTestConfiguration(TestConfiguration testConfiguration);

	DeploymentResult deployUsingIncludeInDomain(URL contributionUrl, Set<QName> deployables);

	DeploymentResult deployUsingAddDomain(URL contributionUrl, Set<QName> deployables);

	// Note: I'm not quite sure about the parameters to this operation, but deployables seems reasonable
	DeploymentResult removeFromDomain(Set<QName> deployables);

}
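
To make the intent concrete, DeploymentResult might be something like the
following (a sketch only; the fields are illustrative, not a proposal for
the exact shape):

import java.util.Collections;
import java.util.List;

/**
 * Illustrative result type for the proposed SPI: captures whether
 * deployment succeeded and any errors the runtime reported, so a test
 * can assert on expected deployment failures as well as successes.
 */
public class DeploymentResult {

	private final boolean successful;
	private final List<String> errors;

	public DeploymentResult(boolean successful, List<String> errors) {
		this.successful = successful;
		this.errors = Collections.unmodifiableList(errors);
	}

	public boolean isSuccessful() {
		return successful;
	}

	public List<String> getErrors() {
		return errors;
	}
}

A deployment-semantics test could then assert directly on the result, e.g.
isSuccessful() for a valid contribution, or a non-empty getErrors() for a
contribution that is expected to fail deployment.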

As a final comment, we probably also need to consider allowing the
test client harness to be something other than Java. I realize this is
potentially a can of worms, but consider the following situation: a
vendor/open source project wants to implement an SCA runtime in some
language other than Java, and the management interface for the runtime
is built using a technology that is not accessible (or not easily
accessible) from Java. I don't have an answer for how to solve this
problem yet. Even if wire-level data were tested, I don't know how
deployment errors could be verified. Maybe the answer is to require a
vendor who does not want to, or cannot, use Java to supply a port of
the test harness. Then the issue is verifying that the test harness is
faithfully reproduced in the proprietary language.

Jim

