Merged
1 change: 1 addition & 0 deletions docs/model_description.md
@@ -16,6 +16,7 @@ transition, usually in the context of climate change mitigation.

### Model Scope

<!-- markdown-link-check-disable-next-line -->
MUSE is an [Integrated Assessment Modelling](https://unfccc.int/topics/mitigation/workstreams/response-measures/modelling-tools-to-assess-the-impact-of-the-implementation-of-response-measures/integrated-assessment-models-iams-and-energy-environment-economy-e3-models)
framework that is designed to enable users to create and apply an agent-based model that simulates a
market equilibrium on a set of user-defined commodities, over a user-defined time period, for a
2 changes: 1 addition & 1 deletion src/simulation.rs
@@ -48,7 +48,7 @@ pub fn run(model: Model, mut assets: AssetPool, output_path: &Path) -> Result<()

// Dispatch optimisation
let solution = perform_dispatch_optimisation(&model, &assets, year)?;
let prices = CommodityPrices::from_model_and_solution(&model, &solution);
let prices = CommodityPrices::from_model_and_solution(&model, &solution, &assets);

// Write result of dispatch optimisation to file
writer.write_flows(year, &assets, solution.iter_commodity_flows_for_assets())?;
101 changes: 76 additions & 25 deletions src/simulation/optimisation.rs
@@ -64,15 +64,19 @@ impl VariableMapKey {
}
}

/// Indicates the commodity ID and time slice selection covered by each commodity constraint
type CommodityConstraintKeys = Vec<(Rc<str>, TimeSliceSelection)>;
/// Indicates the commodity ID and time slice selection covered by each commodity balance constraint
type CommodityBalanceConstraintKeys = Vec<(Rc<str>, TimeSliceSelection)>;

/// Indicates the asset ID and time slice covered by each capacity constraint
type CapacityConstraintKeys = Vec<(AssetID, TimeSliceID)>;

/// The solution to the dispatch optimisation problem
pub struct Solution<'a> {
solution: highs::Solution,
variables: VariableMap,
time_slice_info: &'a TimeSliceInfo,
commodity_constraint_keys: CommodityConstraintKeys,
commodity_balance_constraint_keys: CommodityBalanceConstraintKeys,
capacity_constraint_keys: CapacityConstraintKeys,
}

impl Solution<'_> {
@@ -94,16 +98,14 @@ impl Solution<'_> {
.map(|(key, flow)| (key.asset_id, &key.commodity_id, &key.time_slice, flow))
}

/// Iterate over the newly calculated commodity prices for each time slice.
///
/// Note that there may only be prices for a subset of the commodities; the rest will need to be
/// calculated in another way.
pub fn iter_commodity_prices(&self) -> impl Iterator<Item = (&Rc<str>, &TimeSliceID, f64)> {
// We can get the prices by looking at the dual row values for commodity balance
// constraints. Each commodity balance constraint applies to a particular time slice
// selection (depending on time slice level), but we want to return prices for each time
// slice, so if the selection covers multiple time slices we return the same price for each.
self.commodity_constraint_keys
/// Keys and dual values for commodity balance constraints.
pub fn iter_commodity_balance_duals(
&self,
) -> impl Iterator<Item = (&Rc<str>, &TimeSliceID, f64)> {
// Each commodity balance constraint applies to a particular time slice
// selection (depending on time slice level). Where this covers multiple timeslices,
// we return the same dual for each individual timeslice.
self.commodity_balance_constraint_keys
.iter()
.zip(self.solution.dual_rows())
.flat_map(|((commodity_id, ts_selection), price)| {
@@ -112,6 +114,18 @@
.map(move |(ts, _)| (commodity_id, ts, *price))
})
}
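The selection-to-slice expansion performed by `iter_commodity_balance_duals` can be sketched in isolation. This is a std-only illustration using plain `String`s in place of MUSE's `Rc<str>` commodity IDs and its `TimeSliceID`/`TimeSliceSelection` types, so the shapes here are simplified assumptions rather than the real API:

```rust
/// Expand per-selection duals into per-time-slice duals: when a balance
/// constraint covers several time slices, every slice gets the same dual.
fn expand_duals<'a>(
    keys: &'a [(String, Vec<String>)], // (commodity ID, time slices in the selection)
    duals: &'a [f64],
) -> impl Iterator<Item = (&'a str, &'a str, f64)> {
    keys.iter().zip(duals).flat_map(|((commodity, slices), dual)| {
        slices
            .iter()
            .map(move |ts| (commodity.as_str(), ts.as_str(), *dual))
    })
}
```

As in the real code, the duals are zipped against the constraint keys in row order and then flattened, so one dual value can fan out to several (commodity, time slice) pairs.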

/// Keys and dual values for capacity constraints.
pub fn iter_capacity_duals(&self) -> impl Iterator<Item = (AssetID, &TimeSliceID, f64)> {
self.capacity_constraint_keys
.iter()
.zip(
self.solution.dual_rows()[self.commodity_balance_constraint_keys.len()..]
.iter()
.copied(),
)
.map(|((asset_id, time_slice), dual)| (*asset_id, time_slice, dual))
}
}
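Both iterators rely on the same ordering convention: the solver returns one dual value per constraint row, in the order the rows were added, so the balance duals occupy the first `n` positions of `dual_rows()` and the capacity duals start at index `n`. A minimal std-only sketch of that slicing, with plain strings standing in for the real key types (not the actual highs API):

```rust
/// Split a flat vector of dual values into balance and capacity duals,
/// relying on insertion order: balance rows first, capacity rows after.
fn split_duals(
    balance_keys: &[&str],
    capacity_keys: &[&str],
    dual_rows: &[f64],
) -> (Vec<(String, f64)>, Vec<(String, f64)>) {
    assert_eq!(dual_rows.len(), balance_keys.len() + capacity_keys.len());
    let balance = balance_keys
        .iter()
        .zip(dual_rows)
        .map(|(k, d)| (k.to_string(), *d))
        .collect();
    // Capacity duals start immediately after the balance duals
    let capacity = capacity_keys
        .iter()
        .zip(&dual_rows[balance_keys.len()..])
        .map(|(k, d)| (k.to_string(), *d))
        .collect();
    (balance, capacity)
}
```

This is why the code below asserts on row counts before adding each group of constraints: if any other rows were interleaved, the key/dual zip would silently pair the wrong values.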

/// Perform the dispatch optimisation.
@@ -139,8 +153,8 @@ pub fn perform_dispatch_optimisation<'a>(
let variables = add_variables(&mut problem, model, assets, year);

// Add constraints
let commodity_constraint_keys =
add_asset_contraints(&mut problem, &variables, model, assets, year);
let (commodity_balance_constraint_keys, capacity_constraint_keys) =
add_asset_constraints(&mut problem, &variables, model, assets, year);

// Solve problem
let mut highs_model = problem.optimise(Sense::Minimise);
@@ -160,7 +174,8 @@
solution: solution.get_solution(),
variables,
time_slice_info: &model.time_slice_info,
commodity_constraint_keys,
commodity_balance_constraint_keys,
capacity_constraint_keys,
}),
status => Err(anyhow!("Could not solve: {status:?}")),
}
@@ -264,26 +279,49 @@ fn calculate_cost_coefficient(
}

/// Add asset-level constraints
fn add_asset_contraints(
///
/// Note: the ordering of constraints is important, as the dual values of the constraints must later
/// be retrieved to calculate commodity prices.
///
/// # Arguments:
///
/// * `problem` - The optimisation problem
/// * `variables` - The variables in the problem
/// * `model` - The model
/// * `assets` - The asset pool
/// * `year` - Current milestone year
///
/// # Returns:
///
/// * A vector of keys for commodity balance constraints
/// * A vector of keys for capacity constraints
fn add_asset_constraints(
problem: &mut Problem,
variables: &VariableMap,
model: &Model,
assets: &AssetPool,
year: u32,
) -> CommodityConstraintKeys {
let commodity_constraint_keys =
) -> (CommodityBalanceConstraintKeys, CapacityConstraintKeys) {
let commodity_balance_constraint_keys =
add_commodity_balance_constraints(problem, variables, model, assets, year);

let capacity_constraint_keys = add_asset_capacity_constraints(
problem,
variables,
assets,
&model.time_slice_info,
&commodity_balance_constraint_keys,
);

// **TODO**: Currently it's safe to assume all process flows are non-flexible, as we enforce
// this when reading data in. Once we've added support for flexible process flows, we will
// need to add different constraints for assets with flexible and non-flexible flows.
//
// See: https://github.com/EnergySystemsModellingLab/MUSE_2.0/issues/360
add_fixed_asset_constraints(problem, variables, assets, &model.time_slice_info);

add_asset_capacity_constraints(problem, variables, assets, &model.time_slice_info);

commodity_constraint_keys
// Return constraint keys which are required to calculate commodity prices
(commodity_balance_constraint_keys, capacity_constraint_keys)
}

/// Add asset-level input-output commodity balances.
@@ -299,7 +337,7 @@ fn add_commodity_balance_constraints(
model: &Model,
assets: &AssetPool,
year: u32,
) -> CommodityConstraintKeys {
) -> CommodityBalanceConstraintKeys {
// Sanity check: we rely on the first n values of the dual row values corresponding to the
// commodity constraints, so these must be the first rows
assert!(
@@ -308,7 +346,7 @@
);

let mut terms = Vec::new();
let mut keys = CommodityConstraintKeys::new();
let mut keys = CommodityBalanceConstraintKeys::new();
for commodity in model.commodities.values() {
if commodity.kind != CommodityType::SupplyEqualsDemand
&& commodity.kind != CommodityType::ServiceDemand
@@ -418,8 +456,17 @@ fn add_asset_capacity_constraints(
variables: &VariableMap,
assets: &AssetPool,
time_slice_info: &TimeSliceInfo,
) {
commodity_balance_constraint_keys: &CommodityBalanceConstraintKeys,
) -> CapacityConstraintKeys {
// Sanity check: we rely on the dual rows corresponding to the capacity constraints being
// immediately after the commodity balance constraints in `problem`.
assert!(
problem.num_rows() == commodity_balance_constraint_keys.len(),
"Capacity constraints must be added immediately after commodity constraints."
);

let mut terms = Vec::new();
let mut keys = CapacityConstraintKeys::new();
for asset in assets.iter() {
for time_slice in time_slice_info.iter_ids() {
let mut is_input = false; // NB: there will be at least one PAC
Expand All @@ -438,8 +485,12 @@ fn add_asset_capacity_constraints(
}

problem.add_row(limits, terms.drain(0..));

// Keep track of the order in which constraints were added
keys.push((asset.id, time_slice.clone()));
}
}
keys
}
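The `keys.push(...)` bookkeeping above is the other half of the contract that makes the later dual lookup work: exactly one key is recorded per row, in insertion order. A hypothetical, dependency-free sketch of the same pattern (the real `problem.add_row` call is elided and only the counting is shown):

```rust
/// Add one constraint row per (asset, time slice) pair, recording a key
/// for each row in insertion order so that dual values (returned per row,
/// in order) can be matched back to rows later.
fn add_rows_with_keys(asset_ids: &[u32], time_slices: &[&str]) -> (Vec<(u32, String)>, usize) {
    let mut keys = Vec::new();
    let mut num_rows = 0;
    for &asset_id in asset_ids {
        for &ts in time_slices {
            // In the real code a `problem.add_row(...)` call goes here;
            // the key push must stay in lockstep with it.
            num_rows += 1;
            keys.push((asset_id, ts.to_string()));
        }
    }
    (keys, num_rows)
}
```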

#[cfg(test)]
44 changes: 39 additions & 5 deletions src/simulation/prices.rs
@@ -1,10 +1,11 @@
//! Code for updating the simulation state.
use super::optimisation::Solution;
use crate::agent::AssetPool;
use crate::model::Model;
use crate::time_slice::{TimeSliceID, TimeSliceInfo};
use indexmap::IndexMap;
use log::warn;
use std::collections::HashSet;
use std::collections::{HashMap, HashSet};
use std::rc::Rc;

/// A combination of commodity ID and time slice
@@ -18,9 +19,9 @@ impl CommodityPrices {
/// Calculate commodity prices based on the result of the dispatch optimisation.
///
/// Missing prices will be calculated directly from the input data
pub fn from_model_and_solution(model: &Model, solution: &Solution) -> Self {
pub fn from_model_and_solution(model: &Model, solution: &Solution, assets: &AssetPool) -> Self {
let mut prices = CommodityPrices::default();
let commodities_updated = prices.add_from_solution(solution);
let commodities_updated = prices.add_from_solution(solution, assets);

// Find commodities not updated in last step
let remaining_commodities = model
@@ -34,17 +35,50 @@ impl CommodityPrices {

/// Add commodity prices for which there are values in the solution
///
/// Commodity prices are calculated as the sum of the commodity balance duals and the highest
/// capacity dual for each commodity/timeslice.
///
/// # Arguments
///
/// * `solution` - The solution to the dispatch optimisation
/// * `assets` - The asset pool
///
/// # Returns
///
/// The set of commodities for which prices were added.
fn add_from_solution(&mut self, solution: &Solution) -> HashSet<Rc<str>> {
fn add_from_solution(&mut self, solution: &Solution, assets: &AssetPool) -> HashSet<Rc<str>> {
let mut commodities_updated = HashSet::new();

for (commodity_id, time_slice, price) in solution.iter_commodity_prices() {
// Calculate highest capacity dual for each commodity/timeslice
let mut highest_duals = HashMap::new();
for (asset_id, time_slice, dual) in solution.iter_capacity_duals() {
tsmbland (Collaborator, Author):
@alexdewar Thanks for the comments! The only other thing I'm unsure about is this loop, as it repeats a lot of asset-level computation for every timeslice (getting the asset, getting the PACs, checking whether the flows are positive). Nested loops would avoid this, but that would mean redesigning iter_capacity_duals, and maybe it's not worth the extra work given the trade-off with readability. What do you think?

alexdewar (Member):
Good point... I don't think there's a way around that without making the data structures more complicated, so I'd leave it as is for now. We can always optimise this further down the line if benchmarking shows it's a bottleneck.

One thing we could do to avoid looking up the assets multiple times (though I don't think we should do it just yet) is to switch from saving AssetIDs to Rc<Asset>s for the constraint "keys". We'd also have to change AssetPool to store Rc<Asset>s. If we did this, we'd have to be careful that code wasn't making use of assets after they'd been decommissioned (not a problem for this bit of code). We could use Weaks for that, but that would make things a bit more fiddly.

We could also improve performance for finding PACs. The way we store commodity flows is not particularly optimal in general, so I've opened an issue for it (#449). Who knows how much impact fixing that would have on overall program performance, but it's not an especially big change, so we should probably do it at some point.

let asset = assets.get(asset_id).unwrap();

// Iterate over process pacs
let process_pacs = asset.process.iter_pacs();
for pac in process_pacs {
let commodity = &pac.commodity;

// If the commodity flow is positive (produced PAC)
if pac.flow > 0.0 {
let key: CommodityPriceKey = (commodity.id.clone(), time_slice.clone());
// Update the highest dual for this commodity/timeslice
highest_duals
.entry(key)
.and_modify(|current_dual| {
if dual > *current_dual {
*current_dual = dual;
}
})
.or_insert(dual);
}
}
}

// Add the highest capacity dual for each commodity/timeslice to each commodity balance dual
for (commodity_id, time_slice, dual) in solution.iter_commodity_balance_duals() {
let key = (Rc::clone(commodity_id), time_slice.clone());
let price = dual + highest_duals.get(&key).unwrap_or(&0.0);
self.insert(commodity_id, time_slice, price);
commodities_updated.insert(Rc::clone(commodity_id));
}
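The pricing rule implemented above (price = commodity balance dual + highest capacity dual among producing assets, defaulting to zero when no capacity dual applies) can be sketched as a standalone function. Types are simplified stand-ins, with `String` keys in place of MUSE's `Rc<str>` and `TimeSliceID`:

```rust
use std::collections::HashMap;

/// Price each (commodity, time slice) as the balance dual plus the highest
/// capacity dual among assets producing that commodity in that time slice.
fn compute_prices(
    balance_duals: &[(String, String, f64)],  // (commodity, time slice, dual)
    capacity_duals: &[(String, String, f64)], // (produced commodity, time slice, dual)
) -> HashMap<(String, String), f64> {
    // Highest capacity dual per commodity/time slice
    let mut highest: HashMap<(String, String), f64> = HashMap::new();
    for (commodity, ts, dual) in capacity_duals {
        highest
            .entry((commodity.clone(), ts.clone()))
            .and_modify(|d| *d = (*d).max(*dual))
            .or_insert(*dual);
    }

    // Add the highest capacity dual (if any) to each balance dual
    balance_duals
        .iter()
        .map(|(commodity, ts, dual)| {
            let key = (commodity.clone(), ts.clone());
            let price = dual + highest.get(&key).copied().unwrap_or(0.0);
            (key, price)
        })
        .collect()
}
```

As in the PR, the `entry` API with `and_modify`/`or_insert` keeps a running maximum without a separate existence check.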
tsmbland marked this conversation as resolved.
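The Rc<Asset>/Weak idea floated in the review thread can be sketched as follows. The `Pool` type and its methods here are hypothetical illustrations, not MUSE's actual `AssetPool` API; the point is that handing out `Weak<Asset>` means a caller's reference stops resolving once the asset is decommissioned:

```rust
use std::rc::{Rc, Weak};

struct Asset {
    id: u32,
}

/// A pool that hands out Weak references so that callers cannot keep a
/// decommissioned asset alive (hypothetical sketch, not MUSE's AssetPool).
struct Pool {
    assets: Vec<Rc<Asset>>,
}

impl Pool {
    fn get(&self, id: u32) -> Option<Weak<Asset>> {
        self.assets.iter().find(|a| a.id == id).map(Rc::downgrade)
    }

    fn decommission(&mut self, id: u32) {
        // Dropping the last Rc invalidates all outstanding Weaks
        self.assets.retain(|a| a.id != id);
    }
}
```

The trade-off mentioned in the thread is visible here: every use of a `Weak` requires an `upgrade()` and a `None` check, which is the "fiddliness" the reviewers weighed against avoiding repeated ID lookups.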