
Demonstrator calibration #186

Open · wants to merge 37 commits into main
Conversation

@atulag0711 (Collaborator)

Just adding the dummy scripts with the distributions, plus the inputs for the Snakemake workflow to read them. They are not in their final form, but they should help with testing the updated design problem. I will subsequently update how the calibration generates the calibrated results and how the optimization reads them.

eriktamsen and others added 30 commits October 17, 2022 13:55
- Stochastic optimization with stochastic constraints implementation
- Tested it for the "column simulation code". Graphs and other details are in the shared document.
- Also implemented a method to perform stochastic optimisation when the design variables appear directly in the objective and it is not differentiable (Variational Objective, VO.py); a generic sketch of this idea follows after the commit list.
…ion part. Note that it is not completely modular; just for reference.
…onSolver' into Calibration_Optimisation_HydrationSolver
# Conflicts:
#	environment.yml
#	lebedigital/simulation/precast_column.py
#	tests/demonstrator_scripts/test_column_simulation.py
# Conflicts:
#	lebedigital/demonstrator_scripts/beam_design.py
#	lebedigital/demonstrator_scripts/kpi_from_fem.py
#	tests/demonstrator_scripts/test_beam_design.py
#	usecases/optimization_paper/analyze_kpis/analyze_kpis.py
#	usecases/optimization_paper/optimization_workflow/Inputs/aggregates_volume_fraction.json
#	usecases/optimization_paper/optimization_workflow/Inputs/geometry.json
#	usecases/optimization_paper/optimization_workflow/Inputs/loads.json
#	usecases/optimization_paper/optimization_workflow/Inputs/steel_yield.json
# Conflicts:
#	lebedigital/demonstrator_scripts/dummy_hydration_parameters.py
#	lebedigital/demonstrator_scripts/dummy_paste_strength_stiffness.py
#	usecases/optimization_paper/optimization_workflow/Inputs/material_properties.json
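
For context on the variational-objective commit above, here is a generic, self-contained sketch of the score-function gradient that this kind of stochastic optimisation is commonly built on. It is purely illustrative and not taken from VO.py; the toy objective and all names are assumptions.

import numpy as np

# Variational optimisation sketch: instead of differentiating a
# non-differentiable objective f(x), minimise E_{x ~ N(mu, sigma^2)}[f(x)]
# and follow the score-function gradient
#   grad_mu E[f] = E[ f(x) * (x - mu) / sigma^2 ].
rng = np.random.default_rng(0)

def f(x):
    return np.abs(x - 2.0)  # non-differentiable toy objective, minimum at x = 2

mu, sigma, lr, n_samples = 0.0, 1.0, 0.1, 256
for _ in range(300):
    x = rng.normal(mu, sigma, n_samples)
    fx = f(x)
    # subtracting the mean as a baseline reduces variance without biasing the estimate
    grad_mu = np.mean((fx - fx.mean()) * (x - mu) / sigma**2)
    mu -= lr * grad_mu

print(mu)  # drifts towards the optimum at 2
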
@joergfunger (Member)

Some of the files (in particular the yaml files in the minimum working example) should not be added.

@eriktamsen (Member) left a comment

Thank you for the PR.

  1. A small comment for the future: before creating a branch, please open a new issue and link the branch to it. This helps to keep an overview of the repo.
  2. There are a lot of scripts where I lost track of what is happening, but I guess that is okay. However, I think the structure does not follow our current conventions: we keep scripts in lebedigital and then call them in usecases. Currently we are working on the optimization_paper, so I would not keep things in "usecases/demonstrator".
  3. As @joergfunger mentioned, please do not include the results of the data extraction in the commit, nor the generated PNGs.
  4. Please make sure your scripts are tested individually.
  5. Please add the calibration step to the workflow, or make sure in some other way that it can be run and tested.


# 1-$1: $1 is the first argument of `sbatch run_jobs 100`, i.e. 100 samples in total
# Load Python module
source /home/atul/.bashrc
Member

Is there any way to generalize this?
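
As a reference for generalizing this, here is a minimal sketch (not from the PR) of how the Python side of such an array job could pick up its sample index without depending on a user-specific shell setup; the fallback value is an assumption.

import os

# SLURM exports SLURM_ARRAY_TASK_ID for every task of an array job,
# e.g. one submitted with `sbatch --array=1-100 run_jobs.sh`.
# Falling back to 1 keeps the script runnable outside the cluster.
task_id = int(os.environ.get("SLURM_ARRAY_TASK_ID", "1"))
print(f"processing sample {task_id}")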

Member

I think it would be good to rename the file and function, as it's not a placeholder anymore, right?

Member

I have to be honest, I don't remember what this function was used for. Do we still need it?

Member

Test fails locally.

Member

This is a result of the metadata extraction and should not be committed (see Jörg's comment).

Member

It would be good if these could be created from the workflow rather than committed to the repo.

# - https://github.com/artix41/AVO-pytorch/blob/master/avo-poisson.ipynb
# observation, 9.11.2022: this code is working as it should, accurately recreating Fig. 2 of the paper
import sys
sys.path.extend(['/home/atul/PhD_Tasks/LeBeDigital/ModelCalibration'])  # temporary fix to add the project path
Member

Not sure if this is a general solution.
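
A more portable pattern, as a sketch only (the directory depth parents[2] is an assumption about where the script sits relative to the project root):

import sys
from pathlib import Path

# resolve the project root relative to this file instead of hard-coding a
# user-specific absolute path
project_root = Path(__file__).resolve().parents[2]
sys.path.insert(0, str(project_root))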

Member

Similar question as before: I don't remember anymore what we used the "column" for. Is this part of the calibration?

Member

What is this PDF?

@@ -362,13 +368,15 @@ rule approx_hydration_parameters:
from lebedigital.demonstrator_scripts.dummy_hydration_parameters import dummy_hydration_parameters
# merging the contents of both dictionaries and the individual variable inputs

# had to bypass the pint units here, as they were not letting me use numpy
Member

This sounds dangerous and is a great source of errors.
Can we move the removal of the pint units into the script? There we can first make sure the right units are applied and then remove them; otherwise this will lead to problems. You can access the bare value of a pint object via pint_object.magnitude.
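
A minimal sketch of the suggested pattern, assuming the quantity arrives as a pint Quantity (the function name and the units below are illustrative):

import pint

ureg = pint.UnitRegistry()

def strip_units(quantity, expected_unit):
    # convert to the expected unit first, then return the bare float,
    # so numpy only ever sees plain values in a known unit
    return quantity.to(expected_unit).magnitude

stiffness = 30 * ureg.GPa
value = strip_units(stiffness, "MPa")  # -> 30000.0, safe to hand to numpy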

@eriktamsen (Member)

I merged this branch with main, as Cesary fixed a problem unrelated to the PR.
There are still two existing tests that are not passing, and additional tests don't seem to have been implemented.
