
SensorThingsImporter build problem #1

Open
chenfenggiser opened this issue Sep 22, 2017 · 3 comments

Comments

@chenfenggiser

I want to use SensorThingsImporter to import a CSV file into my SensorThings server database. But when I build the source code, there is one error that makes the build fail.
The error is: Failure to find de.fraunhofer.iosb.ilt:Configurable:jar:0.5 in https://jcenter.bintray.com was cached in the local repository.
Can anyone help me?

@hylkevds
Member

Yes, the Configurable project is not available through bintray yet; that's still on my todo list. You can just clone the project from https://github.com/hylkevds/Configurable and run the mvn install command on it. That should make it available for local Maven builds.

git clone https://github.com/hylkevds/Configurable.git
cd Configurable
mvn install

@chenfenggiser
Author

Hey hylkevds, thanks for your answer. The build problem is now solved, and SensorThingsImporter builds successfully.
I built it and it opens fine. Now I want to import my CSV files into my SensorThings server database. But when I open the Importer and set up a template, some things confuse me. My CSV data has only 4 columns (phenomenon time, temperature value, longitude, latitude). I find it hard to set up the template in the importer: I can't find where to set the position, and when I load my CSV file, a lot of errors appear in the command box.

I think I made some mistakes when setting up my template. Do you have any documents or example data that explain the Importer?
Thanks a million.

@hylkevds
Member

The importer is very much a work-in-progress project, so there is no documentation yet. It also has no support for changing position yet, but you should be able to add that.

In your case, you would select the "ImporterCsv" as importer.

  • The "InputUrl" would point to your csv file.
  • Optionally you might have to set things like the character set and delimiter, in case it is not a proper CSV file.
  • For testing you should set the Row Limit to something small like 5, so it stops after the first 5 rows of the file.
  • If your file has a header row, set the Row Skip to 1, so it skips the first line of the file.
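Before pointing the importer at your file, it can help to sanity-check the delimiter and column count from the command line. A minimal sketch, with made-up sample data shaped like the four-column file described above:

```shell
# Create a made-up four-column sample file: time, temperature, longitude, latitude
cat > sample.csv <<'EOF'
phenomenonTime,temperature,longitude,latitude
2017-09-25T12:17:04+00:00,21.5,8.4037,49.0069
2017-09-25T12:18:04+00:00,21.7,8.4037,49.0069
EOF

# Verify every row has exactly 4 comma-separated columns
awk -F',' 'NF != 4 { print "bad row " NR; exit 1 } END { print "rows: " NR }' sample.csv
# prints: rows: 3
```

If this reports a bad row, the delimiter or character-set settings in the importer probably need adjusting.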

You then add one or more converters. Each converter generates one observation per line in the file, so in your case you need one converter.

  • You have 1 result (no MultiDatastream), so you add 1 item in "result columns" and set it to 1 (columns are numbered starting at 0)
  • Your phenomenonTime is in column 0
  • If your time format is not ISO 8601 with time zone (2017-09-25T12:17:04+00:00), you will have to add a Time Parser and set its Format to your time format. If your times have no time zone information, you also have to set the "Zone" field in the Time Parser.
  • I see the Result Parser is not used yet...
  • Lastly you have to tell the Converter how to set the Datastream for the generated Observations. Here you have two options:
    1. DsMapperFixed: Sets a fixed Datastream id
    2. DsMapperFilter: Queries the target service for a Datastream, using the contents of the text box as a filter. You can use {x} placeholders to substitute the value of column x into this query string. So the example Thing/properties/id eq {1} would execute the request v1.0/Datastreams?$filter=Thing/properties/id eq <value of the second column>
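To make the DsMapperFilter substitution concrete, here is a rough shell sketch of what the importer does with the placeholder. The column value (42) is made up for illustration:

```shell
# Hypothetical: the second column (index 1) of the current row holds the value 42
col1=42
filter='Thing/properties/id eq {1}'

# Substitute the {1} placeholder with the column value, like DsMapperFilter does
query=$(printf '%s' "$filter" | sed "s/{1}/$col1/")
echo "v1.0/Datastreams?\$filter=$query"
# prints: v1.0/Datastreams?$filter=Thing/properties/id eq 42
```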

The Uploader is where you specify the SensorThings API service to upload the observations to.

  • Optionally you can set an authentication method; currently only Basic auth is supported.
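As a sketch of what the uploader sends for each observation: the service URL, credentials, and Datastream id below are all hypothetical, and the JSON body follows the SensorThings API v1.0 Observation format. The sketch prints the request instead of executing it, so it runs without a live server:

```shell
# Hypothetical target service and credentials; adjust for your own server
SERVICE="http://localhost:8080/FROST-Server/v1.0"

# The Observation entity the uploader would POST, linked to Datastream 1
body='{"phenomenonTime": "2017-09-25T12:17:04+00:00", "result": 21.5, "Datastream": {"@iot.id": 1}}'

# Print the curl command rather than sending the request
echo curl -u user:pass -H "Content-Type: application/json" -d "$body" "$SERVICE/Observations"
```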

Finally, you can add a Validator that validates each generated observation before deciding to upload it. The simplest is "ValidatorNewer", which just checks that the generated observation has a phenomenonTime later than that of the latest observation in the service.
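The ValidatorNewer check boils down to a timestamp comparison. A rough shell equivalent, with made-up timestamps (uses GNU date to convert ISO 8601 times to epoch seconds):

```shell
# Latest phenomenonTime already in the service (made up) vs. the new observation's
latest="2017-09-25T12:17:04+00:00"
new="2017-09-25T12:18:04+00:00"

# Upload only if the new observation is strictly newer
if [ "$(date -d "$new" +%s)" -gt "$(date -d "$latest" +%s)" ]; then
  echo "upload"
else
  echo "skip"
fi
# prints: upload
```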
