Contributing to Camel-Kafka-connector
First of all, thank you for your interest in contributing to Camel-Kafka-connector.
The Apache Camel community is a great group of contributors, and Camel-Kafka-connector is one of the newest subprojects in the ecosystem.
There are multiple areas in which camel-kafka-connector could be improved. Here are some examples:
- Surf the basic documentation - if something is not clear, let us know or fix it yourself.
- Try the Getting started guide.
- Dig into the codebase and tune some operations or add new features.
- Take a look at the open issues and leave a comment on an issue to let us know you are working on it.
Getting in touch
Apache Camel Kafka Connector is an Apache Software Foundation project.
All communication is subject to the ASF Code of Conduct.
There are various ways of communicating with the Camel community; for example, we can be reached on the Gitter chat at https://gitter.im/apache/camel-kafka-connector/.
We track issues using the issue tracker on GitHub.
When you’re ready to contribute, create a pull request to the camel-kafka-connector repository.
Expect your pull request to receive a review, and be prepared to respond to the review comments on GitHub.
Building the project
Basically, you can run:
mvn clean package
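If you are only interested in the artifacts and want a faster turnaround, the standard Maven option for skipping unit tests applies here as in any Maven build:
mvn clean package -DskipTests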
Build the project and run integration tests
To build the project, it is sufficient to run:
mvn clean install
To run the integration tests it is required to:
- have Docker version 17.05 or higher running
- run:
mvn -DskipIntegrationTests=false clean verify package
It is also possible to point the tests to external services. To do so, you must set properties for the services that you want to use. This causes the tests to use existing remote instances instead of launching local containers. At the moment, the following properties can be set for remote testing:
- kafka.instance.type
  - kafka.bootstrap.servers
- aws-service.instance.type
  - access.key: AWS access key (mandatory for remote testing)
  - secret.key: AWS secret key (mandatory for remote testing)
  - aws.region: AWS region (optional)
  - aws.host: AWS host (optional)
- aws-service.kinesis.instance.type
  - access.key: AWS access key (mandatory for remote testing)
  - secret.key: AWS secret key (mandatory for remote testing)
  - aws.region: AWS region (optional)
  - aws.host: AWS host (optional)
- elasticsearch.instance.type
  - elasticsearch.host
  - elasticsearch.port
- cassandra.instance.type
  - cassandra.host
  - cassandra.cql3.port
- jms-service.instance.type
  - jms.broker.address
For example, you can run:
mvn -Dkafka.bootstrap.servers=host1:port -Dkafka.instance.type=remote -DskipIntegrationTests=false clean verify package
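Here the kafka.instance.type value remote selects an existing remote instance. As a sketch, assuming the other services follow the same convention (the remote value, host and port below are illustrative placeholders, not verified settings), a run against an existing Elasticsearch instance could look like:
mvn -Delasticsearch.instance.type=remote -Delasticsearch.host=host1 -Delasticsearch.port=9200 -DskipIntegrationTests=false clean verify package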
It’s also possible to use a properties file to set these properties. To do so, use -Dtest.properties=/path/to/file.properties.
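As a minimal sketch, such a file could carry the same Kafka settings used in the example above (host1:port is a placeholder):
# file.properties - point the Kafka tests at an existing remote instance
kafka.instance.type=remote
kafka.bootstrap.servers=host1:port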
Running Salesforce integration tests
The first step is to create an account on Salesforce. The service allows developers to create a free account for testing. The account can be created on their login website.
The next step is to create a new connected application. This will provide you with the API keys that allow the automation to run. The Salesforce help contains the documentation and steps related to this part of the process.
After you create the API keys, adjust a few more parameters necessary for the tests to run. Specifically, the IP Relaxation/range policy parameter should be adjusted (usually to the value "Relax IP restrictions").
You need to set up Change Data Capture to allow our tests to read from the Account table. The Change Data Capture Developer Guide in the Salesforce documentation is the recommended starting point for this.
Lastly, you need to create the configuration files for the sfdx CLI client. The CLI client interacts with the account, creating, updating and deleting records as required for the test execution.
To generate the configuration files, execute the following steps:
- Run the Salesforce CLI container:
docker run --rm --name salesforce-cli -it -v /path/to/sfdx:/root/.sfdx salesforce/salesforcedx
- Within the container, use the following command to log in:
sfdx force:auth:device:login -s -d -i <client ID>
- Provide the client secret when requested and execute the steps requested by the CLI.
- Verify that you are logged in correctly using the following command:
sfdx force:auth:list
It should present an output like:
#### authenticated orgs
ALIAS  USERNAME             ORG ID           INSTANCE URL                 OAUTH METHOD
─────  ───────────────────  ───────────────  ───────────────────────────  ────────────
       your-user@email.com  SOME NUMERIC ID  https://eu31.salesforce.com  web
Note: after leaving the container you might need to adjust the permissions of the directory containing the sfdx configuration files (/path/to/sfdx).
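For example, on Linux you could take ownership of the directory again with a standard chown (using the same placeholder path as above):
sudo chown -R $(id -u):$(id -g) /path/to/sfdx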
Using the IDs, credentials and configurations that you created, you need to set the following system properties to run the tests with Maven:
- -Dit.test.salesforce.enable=true to enable the test
- -Dit.test.salesforce.client.id=<client ID> with the client ID obtained when you created the API keys
- -Dit.test.salesforce.client.secret=<client secret> with the client secret obtained when you created the API keys
- -Dit.test.salesforce.password=<user password> with the password of your account
- -Dit.test.salesforce.username=<user name> with the username of your account
- -Dit.test.salesforce.sfdx.path=/path/to/sfdx with the path to the sfdx configuration (explained below)
Note: the it.test.salesforce.sfdx.path property should point to the directory containing the sfdx CLI client configuration.
To run the tests, enable the salesforce profile so that the DTOs are generated, and set the aforementioned properties to the values set up previously:
mvn -U -Psalesforce -DskipIntegrationTests=false -Dit.test.salesforce.sfdx.path=/path/to/sfdx -Dit.test.salesforce.enable=true -Dit.test.salesforce.client.id=<client id> -Dit.test.salesforce.client.secret=<client secret> -Dit.test.salesforce.password=<password> -Dit.test.salesforce.username=<your account> compile test-compile test