Add beliefs with belief timing (#501)
Allow reading time series from a file that explicitly includes belief times. Also drop duplicate rows when reading time series from a file, with a warning to the user.


* Support importing beliefs from a file with three columns containing event starts, belief times and event values

Signed-off-by: F.N. Claessen <felix@seita.nl>

* Warn for duplicate records and drop them

Signed-off-by: F.N. Claessen <felix@seita.nl>

* flake8

Signed-off-by: F.N. Claessen <felix@seita.nl>

* changelog entry

Signed-off-by: F.N. Claessen <felix@seita.nl>

* Update changelog entry

Signed-off-by: F.N. Claessen <felix@seita.nl>

Signed-off-by: F.N. Claessen <felix@seita.nl>
Flix6x committed Sep 12, 2022
1 parent da74820 commit 074a33d
Showing 3 changed files with 20 additions and 5 deletions.
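
The duplicate handling described in the commit message comes down to pandas' Index.duplicated. Below is a minimal sketch of that behaviour with made-up readings; in the actual code the frame comes from timely_beliefs.read_csv and carries a richer index, so this only illustrates the keep-first-and-warn logic.

import pandas as pd

# Made-up readings; the second row duplicates the first event start.
bdf = pd.DataFrame(
    {"event_value": [215.6, 215.6, 203.8]},
    index=pd.to_datetime(
        ["2020-12-03 14:10", "2020-12-03 14:10", "2020-12-03 14:20"]
    ),
)

# Same approach as the added code: keep the first occurrence, warn about the rest.
duplicate_rows = bdf.index.duplicated(keep="first")
if any(duplicate_rows):
    print("Duplicates found. Dropping duplicates for the following records:")
    print(bdf[duplicate_rows])
    bdf = bdf[~duplicate_rows]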
documentation/changelog.rst: 1 change (1 addition & 0 deletions)
@@ -9,6 +9,7 @@ New features
-------------

* Hit the replay button to replay what happened, available on the sensor and asset pages [see `PR #463 <http://www.github.com/FlexMeasures/flexmeasures/pull/463>`_]
+ * Improved import of time series data from a CSV file: 1) drop duplicate records with a warning, and 2) allow configuring which column contains explicit recording times for each data point (use case: import forecasts) [see `PR #501 <http://www.github.com/FlexMeasures/flexmeasures/pull/501>`_]

Bugfixes
-----------
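For the forecast use case mentioned in the changelog entry, each row carries its own recording time. A hypothetical three-column layout (event start, belief time, event value; the actual column positions are whatever --datecol, --beliefcol and --valuecol point to), read here with plain pandas purely for illustration:

import io

import pandas as pd

# Hypothetical file contents: forecasts recorded at 08:00 for events later that day.
csv = io.StringIO(
    "2020-12-03 14:10,2020-12-03 08:00,215.6\n"
    "2020-12-03 14:20,2020-12-03 08:00,203.8\n"
)
df = pd.read_csv(
    csv,
    header=None,
    names=["event_start", "belief_time", "event_value"],
    parse_dates=["event_start", "belief_time"],
)
# Belief times precede the event starts, i.e. these rows are forecasts.
print((df["belief_time"] < df["event_start"]).all())  # True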
flexmeasures/cli/data_add.py: 22 changes (18 additions & 4 deletions)
@@ -354,6 +354,12 @@ def add_initial_structure():
type=int,
help="Column number with values (1 is 2nd column, the default)",
)
+ @click.option(
+ "--beliefcol",
+ required=False,
+ type=int,
+ help="Column number with datetimes",
+ )
@click.option(
"--delimiter",
required=True,
@@ -395,6 +401,7 @@ def add_beliefs(
nrows: Optional[int] = None,
datecol: int = 0,
valuecol: int = 1,
+ beliefcol: Optional[int] = None,
delimiter: str = ",",
decimal: str = ".",
thousands: Optional[str] = None,
@@ -416,8 +423,8 @@
2020-12-03 14:10,215.6
2020-12-03 14:20,203.8
- In case no --horizon is specified, the moment of executing this CLI command is taken
- as the time at which the beliefs were recorded.
+ In case no --horizon is specified and no beliefcol is specified,
+ the moment of executing this CLI command is taken as the time at which the beliefs were recorded.
"""
sensor = Sensor.query.filter(Sensor.id == sensor_id).one_or_none()
if sensor is None:
@@ -441,7 +448,7 @@
kwargs["sheet_name"] = sheet_number
if horizon is not None:
kwargs["belief_horizon"] = timedelta(minutes=horizon)
- else:
+ elif beliefcol is None:
kwargs["belief_time"] = server_now().astimezone(pytz.timezone(sensor.timezone))

bdf = tb.read_csv(
@@ -453,11 +460,18 @@
header=None,
skiprows=skiprows,
nrows=nrows,
- usecols=[datecol, valuecol],
+ usecols=[datecol, valuecol]
+ if beliefcol is None
+ else [datecol, beliefcol, valuecol],
parse_dates=True,
na_values=na_values,
**kwargs,
)
+ duplicate_rows = bdf.index.duplicated(keep="first")
+ if any(duplicate_rows) > 0:
+ print("Duplicates found. Dropping duplicates for the following records:")
+ print(bdf[duplicate_rows])
+ bdf = bdf[~duplicate_rows]
if unit is not None:
bdf["event_value"] = convert_units(
bdf["event_value"],
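The kwargs branching above gives belief timing a three-way precedence: an explicit --horizon wins, otherwise the belief time defaults to the moment of running the command, unless --beliefcol says to read belief times from the file itself. A standalone sketch of just that precedence, with the actual server_now() call and sensor timezone handling simplified to a plain UTC timestamp:

from datetime import datetime, timedelta, timezone
from typing import Optional


def belief_timing_kwargs(horizon: Optional[int], beliefcol: Optional[int]) -> dict:
    """Mirror the precedence used when reading the file."""
    if horizon is not None:
        # 1) An explicit horizon (in minutes) wins.
        return {"belief_horizon": timedelta(minutes=horizon)}
    elif beliefcol is None:
        # 2) No horizon and no belief column: record beliefs as of "now".
        return {"belief_time": datetime.now(timezone.utc)}
    # 3) Belief times come from the file itself, so no kwarg is needed.
    return {}


print(belief_timing_kwargs(horizon=60, beliefcol=None))  # {'belief_horizon': datetime.timedelta(seconds=3600)}
print(belief_timing_kwargs(horizon=None, beliefcol=1))   # {}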
requirements/app.in: 2 changes (1 addition & 1 deletion)
@@ -28,7 +28,7 @@ tldextract
pyomo>=5.6
tabulate
timetomodel>=0.7.1
- timely-beliefs>=1.11.5
+ timely-beliefs>=1.12
python-dotenv
# a backport, not needed in Python3.8
importlib_metadata
