DAT files allow giving H298, S298, and Cp polynomial terms to construct the Gibbs energy. When there are multiple temperature intervals, each new interval must also provide an H_transformation and S_transformation alongside its Cp polynomial terms.
To add parameters specified this way to a pycalphad database, we need to integrate the heat capacity parameters to obtain H and S, then combine them symbolically into G = H - TS, piecewise in temperature.
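As a concrete illustration of the target construction, here is a single-interval sketch that builds G = H - TS from H298, S298, and a Cp polynomial. It uses sympy purely for demonstration (pycalphad itself uses SymEngine, which has no symbolic integration, which is exactly the problem), and the numeric coefficients are made up:

```python
import sympy

# T is the evaluation temperature; tau is the integration variable.
T, tau = sympy.symbols('T tau', positive=True)

# Hypothetical single-interval data (not from a real DAT file):
# Cp(T) = a + b*T in J/mol/K, with reference values at 298.15 K.
H298, S298 = 100.0, 50.0
a, b = 30.0, 0.01
Cp = a + b*tau

# H(T) = H298 + integral of Cp dT; S(T) = S298 + integral of Cp/T dT
H = H298 + sympy.integrate(Cp, (tau, 298.15, T))
S = S298 + sympy.integrate(Cp/tau, (tau, 298.15, T))

# The piecewise Gibbs energy contribution for this interval.
G = sympy.expand(H - T*S)
```

A sanity check on the result is that dG/dT = -S and -T·d²G/dT² = Cp, which recovers the thermodynamic identities the construction must satisfy.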
Currently, the DAT parser can parse the heat capacity intervals, but it raises an error when converting the parsed intervals to symbolic piecewise expressions (implemented by the pycalphad.io.cs_dat.IntervalCP class). There are two issues at play:
SymEngine does not provide symbolic integration. For the DAT-prescribed polynomial terms (CP_TERMS = (S.Zero, S.One, v.T, v.T**2, v.T**(-2))), we should be able to hardcode analytical antiderivatives of the heat capacity temperature terms as they relate to the Gibbs energy, particularly where the power rule does not apply (e.g. the S.One term, which becomes a T**(-1) integrand in the entropy integral and so integrates to a log term). The DAT file also allows users to provide their own coefficient-exponent pairs, which we won't be able to hardcode integrals for; but since the exponents are real numbers, we should be able to apply the power rule to them and "evaluate" the integrals symbolically by plugging the integration limits into the hardcoded and power-rule terms.
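The hardcoding described above might look like the following sketch. The helper names are hypothetical (not the real pycalphad.io.cs_dat API), and sympy stands in for SymEngine; the point is that no symbolic integration is needed, only closed-form antiderivatives with the two power-rule exceptions handled explicitly:

```python
import sympy

T = sympy.Symbol('T', positive=True)

def enthalpy_integral(c, n, T_low, T_high):
    """Definite integral of the enthalpy integrand c*T**n over [T_low, T_high]."""
    if n == -1:
        F = c*sympy.log(T)            # power rule fails at n == -1
    else:
        F = c*T**(n + 1)/(n + 1)      # power rule for any other real n
    return F.subs(T, T_high) - F.subs(T, T_low)

def entropy_integral(c, n, T_low, T_high):
    """Definite integral of the entropy integrand c*T**(n-1) (i.e. Cp/T)."""
    if n == 0:
        F = c*sympy.log(T)            # the S.One term: integrand is T**(-1)
    else:
        F = c*T**n/n
    return F.subs(T, T_high) - F.subs(T, T_low)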
When there are multiple temperature intervals, each interval's expression needs to include the integration of heat capacity terms (and the transformation enthalpy and entropy) from all preceding intervals. As the code is currently written, the energy contribution for each temperature interval is computed in isolation. For the Cp-type parameters in the DAT file, each interval would have to know about all the preceding intervals to be able to integrate properly from 298 K up to the temperatures in the current interval. Some re-design may be needed in how piecewise Gibbs energy expressions are constructed.
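The accumulation across intervals can be sketched numerically. This is a minimal illustration with made-up constant-Cp intervals (not a real DAT file, and not the pycalphad implementation): each interval adds its transformation terms at its lower boundary, then integrates Cp only over the portion of the interval below the evaluation temperature:

```python
import math

# Hypothetical interval data: (T_upper, constant Cp, H_trans, S_trans).
# The first interval carries no transformation terms; later intervals
# add theirs at the boundary where they begin.
intervals = [
    (500.0, 30.0, 0.0, 0.0),
    (800.0, 35.0, 1000.0, 2.0),   # transformation at 500 K
]

def H_and_S(T, H298=0.0, S298=0.0, T0=298.15):
    """Accumulate H(T) and S(T) by chaining integrals from 298.15 K upward."""
    H, S, T_low = H298, S298, T0
    for T_hi, cp, dH, dS in intervals:
        H += dH                         # transformation enthalpy at interval start
        S += dS                         # transformation entropy at interval start
        T_stop = min(T, T_hi)
        H += cp*(T_stop - T_low)        # integral of Cp dT for constant Cp
        S += cp*math.log(T_stop/T_low)  # integral of Cp/T dT
        if T <= T_hi:
            break
        T_low = T_hi
    return H, S
```

A symbolic version would do the same chaining, but with the upper limit of the final (current) interval left as the symbol T, so that each branch of the piecewise G = H - TS carries the fully accumulated contributions of everything below it.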
The DAT file below is mentioned in the ChemApp documentation and has heat capacity terms that can be used for testing.