65 commits
5dd5ddd
Update
ppinchuk Dec 3, 2025
a30e668
Update
ppinchuk Dec 3, 2025
7f603ad
Merge remote-tracking branch 'origin/main' into pp/routing_module
ppinchuk Dec 6, 2025
515b2ac
Update lockfile
ppinchuk Dec 6, 2025
a3044b0
Add new exception type
ppinchuk Dec 6, 2025
b0e4dc3
Add another error type
ppinchuk Dec 6, 2025
60c1fca
Log class names
ppinchuk Dec 6, 2025
c880945
Update tests to check for name
ppinchuk Dec 6, 2025
1601907
Update logic to look for shape dataset
ppinchuk Dec 7, 2025
74bd763
Add first few tests for routing
ppinchuk Dec 7, 2025
6fec6e9
Routing WIP!!
ppinchuk Dec 7, 2025
a3f4c3d
More tests
ppinchuk Dec 7, 2025
953561b
Add test
ppinchuk Dec 7, 2025
9213d5b
Add test
ppinchuk Dec 7, 2025
c1bc645
Fix typo
ppinchuk Dec 7, 2025
f483595
Fix bug
ppinchuk Dec 7, 2025
08b14b3
Add tracked layers test
ppinchuk Dec 7, 2025
5bffe8a
Add geometry test
ppinchuk Dec 7, 2025
7e5df95
A few more tests
ppinchuk Dec 7, 2025
fd86ec3
Add `__init__.py` to module
ppinchuk Dec 9, 2025
97f49fb
Updates
ppinchuk Dec 9, 2025
4e82865
Don't allow negative costs
ppinchuk Dec 9, 2025
9b31040
Add tests for invalid start/end costs
ppinchuk Dec 9, 2025
dce81b3
Check for error message logs
ppinchuk Dec 9, 2025
fe8ee72
Add routing utility tests
ppinchuk Dec 9, 2025
a80c677
Add utilities module
ppinchuk Dec 9, 2025
2438141
Update docstrings
ppinchuk Dec 9, 2025
54ca855
Add first pass of cli file
ppinchuk Dec 10, 2025
57f558e
Minor updates
ppinchuk Dec 10, 2025
d82b8ae
Connect first routing CLI
ppinchuk Dec 10, 2025
fe8f4aa
Add changes to points logic
ppinchuk Dec 10, 2025
fa64388
Minor docstring updates
ppinchuk Dec 10, 2025
2faaa31
Update
ppinchuk Dec 10, 2025
b16bef0
Partially fix tests
ppinchuk Dec 10, 2025
e9fa521
Fix tests
ppinchuk Dec 10, 2025
ed91627
Add docstrings
ppinchuk Dec 10, 2025
2fe36fb
Minor updates
ppinchuk Dec 10, 2025
3117127
Fix docs
ppinchuk Dec 10, 2025
b12b5e6
No extra log
ppinchuk Dec 10, 2025
73dd974
Fix recursion error
ppinchuk Dec 10, 2025
fbec893
Add tests
ppinchuk Dec 10, 2025
7cd8bd4
MVP routing CLI tests
ppinchuk Dec 10, 2025
c78f4f6
Minor refactor
ppinchuk Dec 10, 2025
e614fa6
Minor formatting
ppinchuk Dec 10, 2025
15cd004
update tests
ppinchuk Dec 10, 2025
024e837
Update tests
ppinchuk Dec 10, 2025
3c796a0
Merge remote-tracking branch 'origin/main' into pp/routing_module
ppinchuk Dec 10, 2025
45e5d39
Update lockfile
ppinchuk Dec 10, 2025
8ad80fe
Merge remote-tracking branch 'origin/main' into pp/routing_module
ppinchuk Dec 10, 2025
06dad47
Minor formatting
ppinchuk Dec 10, 2025
b53749e
Update instructions
ppinchuk Dec 10, 2025
d4968ac
Break out dictionary extraction
ppinchuk Dec 10, 2025
eb0fe31
Minor update
ppinchuk Dec 10, 2025
bc2a0fc
Docstrings
ppinchuk Dec 10, 2025
bb49f8d
Add a few tests
ppinchuk Dec 10, 2025
dbaad90
Update
ppinchuk Dec 10, 2025
cd2d559
Minor fix
ppinchuk Dec 10, 2025
21fe9b6
More messages
ppinchuk Dec 10, 2025
891b822
Use `da.max`
ppinchuk Dec 10, 2025
90dc0c3
Fix rust tests
ppinchuk Dec 10, 2025
6d0edd8
Merge remote-tracking branch 'origin/main' into pp/routing_module
ppinchuk Dec 11, 2025
91d72d7
Minor refactor
ppinchuk Dec 11, 2025
0dd2217
Much lower memory limit
ppinchuk Dec 11, 2025
1188678
`add_layer_to_data` now returns `Result` object
ppinchuk Dec 12, 2025
22e2519
More error types
ppinchuk Dec 12, 2025
53 changes: 40 additions & 13 deletions .github/copilot-instructions.md
@@ -87,25 +87,44 @@ that is only used for development (tests, linting, docs, etc.).
- Follow `docs/source/dev/README.rst` for style: maintain numpy-style
docstrings, avoid type hints unless pre-existing, and keep module-level
imports in the documented order (`numpy as np`, `xarray as xr`, etc.).
- Respect Ruff defaults (line length 79; docstrings 72). Run
`pixi run -e dev ruff check .` and `pixi run -e dev ruff format .` before
opening a PR.
- Favor descriptive names instead of extra comments; only add comments for
non-obvious behavior (e.g., tricky array ops or concurrency concerns).
- Follow Ruff configuration (79 char lines, 72 char lines for docstrings,
double quotes, numpy docstyle). Run locally: `pixi run -e dev ruff check .`
and `pixi run -e dev ruff format .`
- Do not add comments unless they are absolutely critical to understanding;
always prefer descriptive function and variable names instead.
If some functionality needs further explanation, add this to the
class/function/method docstring under the "Notes" section.
- Use absolute imports under `revrt.`.
- Surface warnings/errors through `revrt.warn` and `revrt.exceptions` (e.g.,
raise `revrtValueError` and emit `revrtWarning`) to ensure logging hooks fire.
- Logging: Use `logging.getLogger(__name__)`. Log at appropriate levels
(DEBUG, INFO, WARNING, ERROR, CRITICAL). Avoid print statements.
- Version: Never edit `_version.py` manually (auto-generated by `setuptools_scm`).
Tag releases `vX.Y.Z`.
- When touching PyO3 bindings, update both Python shims in `revrt/_rust.py` (if
needed) and the Rust exports to keep signatures aligned.

## 6. Docstring Guidelines (Python)
- Use numpy-style docstrings; first line must omit a trailing period.
- Keep docstring lines ≤72 characters and avoid short summaries on
`__init__` methods.
- Document parameters in the method/function docstring, not the class-level
docstring; protected helpers (`_name`) should have single-sentence docs.
- Avoid type hints.
- Keep docstring length to 72 characters per line.
- Never include a period (".") at the end of the first line of docstrings.
- Do not add a short summary to __init__ methods. Instead, keep the line blank
and start the "Parameters" section after a second newline.
- Do not document parameters in the class docstring - do that in the __init__
docstring instead.
- Do not add docstring to dunder methods (e.g., __str__, __repr__, etc.)
unless absolutely necessary.
- All @property and @cached_property method documentation should be one line
long and should start with the return type followed by a colon
(e.g. `"""str: My string property"""`).
- If the default value for a parameter is **not** `None`, document it using
the format: `param_name : type, default=<default value>`. If the default
value for a parameter **is** `None`, use the format : `param_name : type, optional`.
- "Protected" functions and methods (i.e. starting with an underscore)
should always be documented using **only** one-line summary docstrings.
- To exclude functions or classes from the public API documentation, start
the docstring with the token ``[NOT PUBLIC API]``.
- Maintain intersphinx references where possible (see dev guide for mappings).
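As a minimal sketch of the docstring rules above (function and parameter names are hypothetical, not from the repo):

```python
def clip_costs(costs, min_cost=None, max_cost=1e6):
    """Clip cost values to a closed range

    Hypothetical example illustrating the conventions above: no
    trailing period on the summary line, ``optional`` for ``None``
    defaults, ``default=`` otherwise, lines under 72 characters.

    Parameters
    ----------
    costs : list of float
        Cost values to clip.
    min_cost : float, optional
        Lower bound. If ``None``, no lower bound is applied.
    max_cost : float, default=1e6
        Upper bound applied to every cost value.

    Returns
    -------
    list of float
        Clipped cost values.
    """
    low = min_cost if min_cost is not None else float("-inf")
    return [min(max(c, low), max_cost) for c in costs]
```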

## 7. Coding Guidelines (Rust)
@@ -128,16 +147,24 @@ that is only used for development (tests, linting, docs, etc.).
`tests/conftest.py` for shared setup. Keep all test data small (individual files <1MB).
The data doesn't need to fully reproduce realistic analysis cases - it just
needs to include characteristics of a realistic case.
- All python test files (e.g. ``test_scenario.py``) should end with the
following block of code:

.. code-block:: python

if __name__ == "__main__":
pytest.main(["-q", "--show-capture=all", Path(__file__), "-rapP"])

This allows the (single) file to be executed, running only the tests contained
within, which is extremely useful when updating/modifying/adding tests in the file.
- Pytest options for parallel execution (`-n auto`) are supported; prefer
`pixi run -e dev pytest -n auto` for heavier suites.

## 9. Logging, Errors, and Warnings
- Do not log-and-raise manually; custom exceptions/warnings already emit log
records.
- Prefer `revrt.utilities.log_mem` for memory-sensitive workflows to keep log
output consistent.
- CLI commands should rely on the logging configuration provided in
`revrt._cli.configure_logging` to avoid duplicate handlers.
- CLI commands should surface actionable messages and exit codes without
relying on hidden background logging.
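The log-on-raise behavior referenced above ("custom exceptions/warnings already emit log records") can be sketched as follows; the class name is illustrative, and the real hooks live in `revrt.exceptions`:

```python
import logging

logger = logging.getLogger("revrt")


class RevrtValueError(ValueError):
    """Hypothetical exception that logs its message when raised

    Sketch of the pattern above: constructing the exception emits
    the log record, so call sites never log-and-raise manually.
    """

    def __init__(self, message):
        logger.error(message)  # log record emitted automatically
        super().__init__(message)
```

With this pattern, `raise RevrtValueError("bad cost")` both logs and raises in a single statement.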

## 10. Common Pitfalls & Gotchas
- The PyO3 module must be rebuilt if Rust code changes; run a Pixi task (tests
125 changes: 89 additions & 36 deletions crates/revrt/src/dataset.rs
@@ -14,7 +14,7 @@ use zarrs::storage::{

use crate::ArrayIndex;
use crate::cost::CostFunction;
use crate::error::Result;
use crate::error::{Error, Result};
pub(crate) use lazy_subset::LazySubset;

/// Manages the features datasets and calculated total cost
@@ -65,33 +65,48 @@ impl Dataset {

trace!("Creating a new group for the cost dataset");
zarrs::group::GroupBuilder::new()
.build(swap.clone(), "/")
.unwrap()
.store_metadata()
.unwrap();
.build(swap.clone(), "/")?
.store_metadata()?;

let entries = source
.list()
.expect("failed to list variables in source dataset");
let first_entry = entries
let first_entry_opt = entries
.into_iter()
.map(|entry| entry.to_string())
.find(|entry| {
let name = entry.split('/').next().unwrap_or("").to_ascii_lowercase();
// Skip coordinate axes when selecting a representative variable for cost storage.
const EXCLUDES: [&str; 6] =
["latitude", "longitude", "band", "x", "y", "spatial_ref"];
!name.ends_with(".json") && !EXCLUDES.iter().any(|needle| name.contains(needle))
})
.expect("no suitable variables found in source dataset");
let varname = first_entry.split('/').next().unwrap().to_string();
!name.ends_with(".json") && !EXCLUDES.iter().any(|needle| name == *needle)
});
let first_entry = match first_entry_opt {
Some(e) => e,
None => {
return Err(Error::IO(std::io::Error::other(format!(
"no non-coordinate variables found in source dataset: {:?}",
source.list().ok()
))));
}
};

// Skip coordinate axes when selecting a representative variable for cost storage.
let varname = match first_entry.split('/').next() {
Some(name) => name,
None => {
return Err(Error::IO(std::io::Error::other(
"Could not determine any variable names from source dataset",
)));
}
};
debug!("Using '{}' to determine shape of cost data", varname);
let tmp = zarrs::array::Array::open(source.clone(), &format!("/{varname}")).unwrap();
let tmp = zarrs::array::Array::open(source.clone(), &format!("/{varname}"))?;
let chunk_grid = tmp.chunk_grid();
debug!("Chunk grid info: {:?}", &chunk_grid);

add_layer_to_data("cost_invariant", chunk_grid, &swap);
add_layer_to_data("cost", chunk_grid, &swap);
add_layer_to_data("cost_invariant", chunk_grid, &swap)?;
add_layer_to_data("cost", chunk_grid, &swap)?;

let cost_chunk_idx = ndarray::Array2::from_elem(
(
@@ -268,10 +283,11 @@

// Calculate the average with center point (half grid + other half grid).
// Also, apply the diagonal factor for the extra distance.
// Finally, add any invariant costs.
let cost_to_neighbors = neighbors
.iter()
.zip(invariant_neighbors.iter())
.filter(|(((ir, jr), _), _)| !(*ir == i && *jr == j)) // no center point
.filter(|(((ir, jr), v), _)| !(*ir == i && *jr == j) && *v > 0.) // no center point and only positive costs
.map(|(((ir, jr), v), ((inv_ir, inv_jr), inv_cost))| {
debug_assert_eq!((ir, jr), (inv_ir, inv_jr));
((ir, jr), 0.5 * (v + center.1), inv_cost)
@@ -283,9 +299,8 @@
} else {
v
};
((ir, jr), scaled + inv_cost)
(ArrayIndex { i: *ir, j: *jr }, scaled + inv_cost)
})
.map(|((ir, jr), v)| (ArrayIndex { i: *ir, j: *jr }, v))
.collect::<Vec<_>>();

trace!("Neighbors {:?}", cost_to_neighbors);
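For reference, the neighbor-cost rule in this hunk — average each neighbor with the center cell, scale diagonal steps by √2, add invariant costs, and drop the center point plus any non-positive neighbors — can be sketched in Python. This is a simplified in-memory stand-in for the chunked Rust implementation, using plain dicts instead of zarr arrays:

```python
import math


def cost_to_neighbors(grid, invariant, i, j):
    """Simplified sketch of the 3x3 neighbor-cost rule

    ``grid`` and ``invariant`` map ``(row, col)`` to a float
    cost; the real code streams zarr chunks instead.
    """
    center = grid[(i, j)]
    costs = []
    for ir in (i - 1, i, i + 1):
        for jr in (j - 1, j, j + 1):
            if (ir, jr) == (i, j) or (ir, jr) not in grid:
                continue  # skip center point and out-of-bounds cells
            value = grid[(ir, jr)]
            if value <= 0:
                continue  # only positive costs are traversable
            # Half the center cell plus half the neighbor cell
            step = 0.5 * (value + center)
            if ir != i and jr != j:
                step *= math.sqrt(2)  # diagonal moves cover more distance
            costs.append(((ir, jr), step + invariant.get((ir, jr), 0.0)))
    return costs
```

For a 2×2 grid whose costs equal their flat index, starting from (0, 1) this yields (1, 0) → 1.5·√2 and (1, 1) → 2.0, matching the updated `test_get_3x3_two_by_two_array` cases below.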
@@ -352,29 +367,28 @@ fn add_layer_to_data(
layer_name: &str,
chunk_shape: &ChunkGrid,
swap: &ReadableWritableListableStorage,
) {
) -> Result<()> {
trace!("Creating an empty {} array", layer_name);
let dataset_path = format!("/{layer_name}");
zarrs::array::ArrayBuilder::new_with_chunk_grid(
// cost_shape,
let builder = zarrs::array::ArrayBuilder::new_with_chunk_grid(
chunk_shape.clone(),
zarrs::array::DataType::Float32,
zarrs::array::FillValue::from(zarrs::array::ZARR_NAN_F32),
)
.build(swap.clone(), &dataset_path)
.unwrap()
.store_metadata()
.unwrap();
);

let built = builder.build(swap.clone(), &dataset_path)?;
built.store_metadata()?;

let array = zarrs::array::Array::open(swap.clone(), &dataset_path).unwrap();
let array = zarrs::array::Array::open(swap.clone(), &dataset_path)?;
trace!("'{}' shape: {:?}", layer_name, array.shape().to_vec());
trace!("'{}' chunk shape: {:?}", layer_name, array.chunk_grid());

trace!(
"Dataset contents after '{}' creation: {:?}",
layer_name,
swap.list().unwrap()
swap.list()?
);
Ok(())
}

#[cfg(test)]
Expand All @@ -388,14 +402,19 @@ mod tests {
let path = samples::multi_variable_zarr();
let cost_function =
CostFunction::from_json(r#"{"cost_layers": [{"layer_name": "A"}]}"#).unwrap();
let dataset =
Dataset::open(path, cost_function, 250_000_000).expect("Error opening dataset");
let dataset = Dataset::open(path, cost_function, 1_000).expect("Error opening dataset");

let test_points = [ArrayIndex { i: 3, j: 1 }, ArrayIndex { i: 2, j: 2 }];
let array = zarrs::array::Array::open(dataset.source.clone(), "/A").unwrap();
for point in test_points {
let results = dataset.get_3x3(&point);

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);
let ArrayIndex { i: ci, j: cj } = point;
let center_subset = zarrs::array_subset::ArraySubset::new_with_ranges(&[
0..1,
@@ -471,6 +490,12 @@
for point in test_points {
let results = dataset.get_3x3(&point);

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);
let ArrayIndex { i: ci, j: cj } = point;
let center_subset = zarrs::array_subset::ArraySubset::new_with_ranges(&[
0..1,
@@ -545,13 +570,20 @@

let results = dataset.get_3x3(&ArrayIndex { i: 0, j: 0 });

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);

assert_eq!(results, vec![]);
}

#[test_case((0, 0), vec![(0, 1, 0.5), (1, 0, 1.0), (1, 1, 1.5 * SQRT_2)] ; "top left corner")]
#[test_case((0, 1), vec![(0, 0, 0.5), (1, 0, 1.5 * SQRT_2), (1, 1, 2.)] ; "top right corner")]
#[test_case((1, 0), vec![(0, 0, 1.), (0, 1, 1.5 * SQRT_2), (1, 1, 2.5)] ; "bottom left corner")]
#[test_case((1, 1), vec![(0, 0, 1.5 * SQRT_2), (0, 1, 2.), (1, 0, 2.5)] ; "bottom right corner")]
#[test_case((0, 1), vec![(1, 0, 1.5 * SQRT_2), (1, 1, 2.)] ; "top right corner")]
#[test_case((1, 0), vec![(0, 1, 1.5 * SQRT_2), (1, 1, 2.5)] ; "bottom left corner")]
#[test_case((1, 1), vec![(0, 1, 2.), (1, 0, 2.5)] ; "bottom right corner")]
fn test_get_3x3_two_by_two_array((si, sj): (u64, u64), expected_output: Vec<(u64, u64, f32)>) {
let path = samples::cost_as_index_zarr((2, 2), (2, 2));
let cost_function =
@@ -561,6 +593,13 @@

let results = dataset.get_3x3(&ArrayIndex { i: si, j: sj });

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);

assert_eq!(
results,
expected_output
@@ -571,10 +610,10 @@
}

#[test_case((0, 0), vec![(0, 1, 0.5), (1, 0, 1.5), (1, 1, 2.0 * SQRT_2)] ; "top left corner")]
#[test_case((0, 1), vec![(0, 0, 0.5), (0, 2, 1.5), (1, 0, 2.0 * SQRT_2), (1, 1, 2.5), (1, 2, 3. * SQRT_2)] ; "top middle")]
#[test_case((0, 1), vec![(0, 2, 1.5), (1, 0, 2.0 * SQRT_2), (1, 1, 2.5), (1, 2, 3. * SQRT_2)] ; "top middle")]
#[test_case((0, 2), vec![(0, 1, 1.5), (1, 1, 3.0 * SQRT_2), (1, 2, 3.5)] ; "top right corner")]
#[test_case((1, 0), vec![(0, 0, 1.5), (0, 1, 2.0 * SQRT_2), (1, 1, 3.5), (2, 0, 4.5), (2, 1, 5.0 * SQRT_2)] ; "middle left")]
#[test_case((1, 1), vec![(0, 0, 2.0 * SQRT_2), (0, 1, 2.5), (0, 2, 3.0 * SQRT_2), (1, 0, 3.5), (1, 2, 4.5), (2, 0, 5.0 * SQRT_2), (2, 1, 5.5), (2, 2, 6.0 * SQRT_2)] ; "middle middle")]
#[test_case((1, 0), vec![(0, 1, 2.0 * SQRT_2), (1, 1, 3.5), (2, 0, 4.5), (2, 1, 5.0 * SQRT_2)] ; "middle left")]
#[test_case((1, 1), vec![(0, 1, 2.5), (0, 2, 3.0 * SQRT_2), (1, 0, 3.5), (1, 2, 4.5), (2, 0, 5.0 * SQRT_2), (2, 1, 5.5), (2, 2, 6.0 * SQRT_2)] ; "middle middle")]
#[test_case((1, 2), vec![(0, 1, 3.0 * SQRT_2), (0, 2, 3.5), (1, 1, 4.5), (2, 1, 6.0 * SQRT_2), (2, 2, 6.5)] ; "middle right")]
#[test_case((2, 0), vec![(1, 0, 4.5), (1, 1, 5.0 * SQRT_2), (2, 1, 6.5)] ; "bottom left corner")]
#[test_case((2, 1), vec![(1, 0, 5.0 * SQRT_2), (1, 1, 5.5), (1, 2, 6.0 * SQRT_2), (2, 0, 6.5), (2, 2, 7.5)] ; "bottom middle")]
@@ -591,6 +630,13 @@

let results = dataset.get_3x3(&ArrayIndex { i: si, j: sj });

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);

assert_eq!(
results,
expected_output
@@ -601,10 +647,10 @@
}

#[test_case((0, 0), vec![(0, 1, 0.5), (1, 0, 2.), (1, 1, 2.5 * SQRT_2)] ; "top left corner")]
#[test_case((0, 1), vec![(0, 0, 0.5), (0, 2, 1.5), (1, 0, 2.5 * SQRT_2), (1, 1, 3.), (1, 2, 3.5 * SQRT_2)] ; "top left edge")]
#[test_case((0, 1), vec![(0, 2, 1.5), (1, 0, 2.5 * SQRT_2), (1, 1, 3.), (1, 2, 3.5 * SQRT_2)] ; "top left edge")]
#[test_case((0, 2), vec![(0, 1, 1.5), (0, 3, 2.5), (1, 1, 3.5 * SQRT_2), (1, 2, 4.), (1, 3, 4.5 * SQRT_2)] ; "top right edge")]
#[test_case((0, 3), vec![(0, 2, 2.5), (1, 2, 4.5 * SQRT_2), (1, 3, 5.)] ; "top right corner")]
#[test_case((1, 0), vec![(0, 0, 2.), (0, 1, 2.5 * SQRT_2), (1, 1, 4.5), (2, 0, 6.), (2, 1, 6.5 * SQRT_2)] ; "left top edge")]
#[test_case((1, 0), vec![(0, 1, 2.5 * SQRT_2), (1, 1, 4.5), (2, 0, 6.), (2, 1, 6.5 * SQRT_2)] ; "left top edge")]
#[test_case((1, 3), vec![(0, 2, 4.5 * SQRT_2), (0, 3, 5.), (1, 2, 6.5), (2, 2, 8.5 * SQRT_2), (2, 3, 9.)] ; "right top edge")]
#[test_case((2, 0), vec![(1, 0, 6.), (1, 1, 6.5 * SQRT_2), (2, 1, 8.5), (3, 0, 10.), (3, 1, 10.5 * SQRT_2)] ; "left bottom edge")]
#[test_case((2, 3), vec![(1, 2, 8.5 * SQRT_2), (1, 3, 9.), (2, 2, 10.5), (3, 2, 12.5 * SQRT_2), (3, 3, 13.)] ; "right bottom edge")]
@@ -624,6 +670,13 @@

let results = dataset.get_3x3(&ArrayIndex { i: si, j: sj });

// index 0, 0 has a cost of 0 and should therefore be filtered out
assert!(
!results
.iter()
.any(|(ArrayIndex { i, j }, _)| *i == 0 && *j == 0)
);

assert_eq!(
results,
expected_output
9 changes: 9 additions & 0 deletions crates/revrt/src/error.rs
@@ -5,6 +5,15 @@ pub enum Error {
#[error(transparent)]
IO(#[from] std::io::Error),

#[error(transparent)]
ZarrsArrayCreate(#[from] zarrs::array::ArrayCreateError),

#[error(transparent)]
ZarrsGroupCreate(#[from] zarrs::group::GroupCreateError),

#[error(transparent)]
ZarrsStorage(#[from] zarrs::storage::StorageError),

#[allow(dead_code)]
#[error("Undefined error")]
// Used during development while it is not clear a category of error
3 changes: 3 additions & 0 deletions crates/revrt/src/ffi.rs
@@ -12,6 +12,9 @@ impl From<Error> for PyErr {
fn from(err: Error) -> PyErr {
match err {
Error::IO(msg) => PyIOError::new_err(msg),
Error::ZarrsArrayCreate(e) => PyIOError::new_err(e.to_string()),
Error::ZarrsStorage(e) => PyIOError::new_err(e.to_string()),
Error::ZarrsGroupCreate(e) => PyIOError::new_err(e.to_string()),
Error::Undefined(msg) => revrtRustError::new_err(msg),
}
}
1 change: 0 additions & 1 deletion crates/revrt/src/lib.rs
@@ -185,7 +185,6 @@ mod tests {
#[test_case((1, 1), vec![(1, 3), (3, 1)], 1.; "horizontal and vertical")]
#[test_case((3, 3), vec![(3, 5), (1, 1), (3, 1)], 1.; "horizontal")]
#[test_case((3, 3), vec![(5, 3), (5, 5), (1, 3)], 1.; "vertical")]
#[test_case((3, 3), vec![(3, 1), (3, 4)], 0.; "zero costs")]
fn routing_one_point_to_many_same_cost_and_length(
(si, sj): (u64, u64),
endpoints: Vec<(u64, u64)>,
2 changes: 1 addition & 1 deletion revrt/__init__.py
@@ -1,4 +1,4 @@
"""Routing analysis library for the reV model"""
"""Routing analysis library"""

import importlib.metadata

Expand Down
2 changes: 2 additions & 0 deletions revrt/_cli.py
@@ -7,6 +7,7 @@
from revrt import __version__
from revrt.spatial_characterization.cli import route_characterizations_command
from revrt.costs.cli import build_masks_command, build_routing_layers_command
from revrt.routing.cli import route_points_command
from revrt.utilities.cli import (
layers_to_file_command,
layers_from_file_command,
Expand All @@ -24,6 +25,7 @@
build_masks_command,
build_routing_layers_command,
route_characterizations_command,
route_points_command,
]
main = make_cli(commands, info={"name": "reVRt", "version": __version__})
