Conversation
This commit implements comprehensive serialization and deserialization of NetworkReductionData within PTDF HDF5 files, addressing the issue where network reduction maps were lost during serialization, rendering the PowerNetworkMatrices less useful.

Changes:
- Extended to_hdf5() to serialize NetworkReductionData fields
- Extended from_hdf5() to deserialize NetworkReductionData
- Added JSON3 import for handling complex nested data structures
- Implemented helper functions for serializing:
  * Simple sets and dictionaries (bus maps, removed buses/arcs)
  * Complex dictionaries with tuple keys and complex values
  * Bus reduction maps and reverse search maps
  * Added branch and admittance maps
  * Branch metadata (direct, parallel, series, transformer3W)
  * ReductionContainer with all reduction types

Implementation Details:
- Uses HDF5 groups to organize network reduction data hierarchically
- Employs JSON3 for serializing complex nested structures
- Preserves network topology information (bus/arc mappings)
- Stores metadata about PSY objects (names, types) where full object serialization is not feasible
- Maintains backward compatibility with existing HDF5 files
- Empty NetworkReductionData is handled efficiently

Limitations:
- PSY object references (ACTransmission, ThreeWindingTransformer) are not fully serialized, only their metadata
- Reverse maps (reverse_direct_branch_map, etc.) are preserved as empty since they contain PSY objects as keys
- Derived fields (all_branch_maps_by_type, name_to_arc_map) are not serialized and should be repopulated if needed

Testing:
- Added comprehensive test suite in test_network_reduction_serialization.jl
- Tests cover RadialReduction, DegreeTwoReduction, and combined reductions
- Verifies topology preservation, compression compatibility, and backward compatibility

Fixes issue where PowerNetworkMatrices became less useful after deserialization due to lost network reduction information.
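The hierarchical-groups-plus-JSON approach described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: the function name `write_reduction_sketch` and the group/attribute names are assumptions, and it assumes the HDF5.jl and JSON3.jl packages.

```julia
using HDF5, JSON3

# Sketch: store a nested reduction map as a JSON string attribute inside a
# dedicated HDF5 group. Names ("network_reduction_data", "bus_reduction_map")
# are illustrative, mirroring the commit description.
function write_reduction_sketch(path::String, bus_reduction_map::Dict{Int, Set{Int}})
    h5open(path, "w") do file
        nrd_group = create_group(file, "network_reduction_data")
        # JSON object keys must be strings; Sets become arrays.
        json_ready = Dict(string(k) => collect(v) for (k, v) in bus_reduction_map)
        HDF5.attributes(nrd_group)["bus_reduction_map"] = JSON3.write(json_ready)
    end
end
```

Storing complex maps as JSON attributes keeps the HDF5 layout simple, at the cost of parsing on read.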
…tation

- Changed incorrect system name from 'c_sys5_re' to 'c_sys5'
- Fixed network_reductions parameter to use NetworkReduction[...] type annotation instead of plain [...] arrays, matching the pattern used in other tests
- This ensures the reduction types are correctly typed as NetworkReduction abstract type elements, which is required by the PTDF/Ybus constructors
@copilot review the code errors in the getter function that is failing in the tests and make a correction to the tests. Check the logs from the GitHub actions.
…cess

Co-authored-by: jd-lara <16385323+jd-lara@users.noreply.github.com>
Fix getter function usage in network reduction serialization tests
- Replace attribute-based check with direct group existence check
- Use haskey(file, 'network_reduction_data') instead of reading attributes
- Remove unnecessary attribute writes for empty NetworkReductionData
- This fixes deserialization of older HDF5 files without network reduction data
- Improves robustness when handling files created before this feature
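The group-existence check described above might look like this. A minimal sketch, assuming HDF5.jl; the function name `has_reduction_data` is illustrative, not part of the package:

```julia
using HDF5

# Sketch: older PTDF HDF5 files have no "network_reduction_data" group, so we
# test for the group itself with haskey instead of reading an attribute flag.
function has_reduction_data(path::String)
    h5open(path, "r") do file
        # haskey on an HDF5 file/group checks for a child group or dataset.
        return haskey(file, "network_reduction_data")
    end
end
```

On a pre-feature file this returns `false` and the caller can fall back to an empty NetworkReductionData, which is what preserves backward compatibility.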
```julia
# Serialize branch maps (these contain PSY objects, so we store metadata)
_serialize_direct_branch_map(nrd_group, nrd.direct_branch_map)
_serialize_parallel_branch_map(nrd_group, nrd.parallel_branch_map)
_serialize_series_branch_map(nrd_group, nrd.series_branch_map, compress, compression_level)
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
_serialize_series_branch_map(
    nrd_group,
    nrd.series_branch_map,
    compress,
    compression_level,
)
```
```julia
transformer3W_map = _deserialize_transformer3W_map(nrd_group)

# Deserialize direct_branch_name_map
direct_branch_name_map = _deserialize_dict_string_tuple(nrd_group, "direct_branch_name_map")
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
direct_branch_name_map =
    _deserialize_dict_string_tuple(nrd_group, "direct_branch_name_map")
```
```julia
# Create NetworkReductionData with all fields
# Note: reverse maps and derived fields are intentionally left empty
# as they would contain PSY objects which cannot be serialized
return NetworkReductionData(
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
return NetworkReductionData(;
```
- Add explicit collect() calls to convert JSON3.Array to Vector types
- Use collect(Int, ...) for type-safe array conversion
- Add explicit type conversions: String(), Int(), Bool()
- Fixes type incompatibility errors when deserializing from HDF5
- JSON3.read() returns JSON3-specific types that need conversion to base Julia types
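The conversions above can be demonstrated in isolation. A sketch assuming JSON3.jl; the payload and field names are illustrative:

```julia
using JSON3

# JSON3.read yields JSON3.Array / JSON3.Object wrappers rather than the base
# Julia types struct fields expect; convert explicitly at the boundary.
raw = JSON3.read("""{"buses": [1, 2, 3], "name": "line-1-2", "flag": true}""")

buses = collect(Int, raw["buses"])   # Vector{Int} instead of JSON3.Array
name = String(raw["name"])           # plain base String
flag = Bool(raw["flag"])             # Bool
```

Converting once at the deserialization boundary keeps the rest of the code free of JSON3-specific types.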
Replace all get(HDF5.attributes(...), key, default) calls with haskey(HDF5.attributes(...), key). The get() function doesn't work correctly with HDF5.attributes objects - it needs haskey() instead. This fixes the CI/CD test failures where attribute checks were failing during deserialization.

Changes:
- _deserialize_set: Use haskey instead of get
- _deserialize_set_of_tuples: Use haskey instead of get
- _deserialize_bus_reduction_map: Use haskey instead of get
- _deserialize_dict_int_int: Use haskey instead of get
- _deserialize_dict_tuple_complex: Use haskey instead of get
- _deserialize_dict_int_complex: Use haskey instead of get
- _deserialize_dict_string_tuple: Use haskey instead of get
- _deserialize_direct_branch_map: Use haskey instead of get
- _deserialize_parallel_branch_map: Use haskey instead of get
- _deserialize_series_branch_map: Use haskey instead of get
- _deserialize_transformer3W_map: Use haskey instead of get
The issue was that haskey() and get() don't work reliably with HDF5.attributes() objects.

Solution:
- For datasets: Check with haskey(group, dataset_name) which works correctly
- For attributes: Use try-catch blocks when reading attributes since haskey doesn't work with HDF5.Attributes

Changes:
- Removed haskey() checks on HDF5.attributes() objects
- Use try-catch for attributes stored as JSON (bus_reduction_map, direct_branch_name_map, reduction_container)
- Use haskey(group, dataset_name) for actual HDF5 datasets
- Removed redundant '_empty' attribute checks - just check if dataset exists

This follows the pattern used elsewhere in HDF5.jl where attributes are read with try-catch.
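The try-catch attribute pattern described above might be sketched like this. Assumes HDF5.jl and JSON3.jl; the helper name `read_json_attribute` and the attribute name are illustrative:

```julia
using HDF5, JSON3

# Sketch: attempt the attribute read and fall back on failure, since haskey is
# unreliable on HDF5.Attributes (haskey(group, name) still works for datasets).
function read_json_attribute(group, attr_name::String)
    json_str = try
        read(HDF5.attributes(group)[attr_name])
    catch
        return nothing  # attribute absent: caller falls back to an empty map
    end
    return JSON3.read(json_str)
end
```

The caller treats `nothing` as "no data stored", which covers both empty reductions and files written before the feature existed.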
```julia
        return Set{Tuple{Int, Int}}()
    end
    tuples_array = read(group[name])
    return Set{Tuple{Int, Int}}([(tuples_array[1, i], tuples_array[2, i]) for i in 1:size(tuples_array, 2)])
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
return Set{Tuple{Int, Int}}([
    (tuples_array[1, i], tuples_array[2, i]) for i in 1:size(tuples_array, 2)
])
```
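The round trip behind this excerpt can be shown in pure Julia (the real code writes `tuples_array` to an HDF5 dataset, omitted here): the tuple set is stored as a 2×N integer matrix, one column per pair, since HDF5 handles plain numeric arrays natively.

```julia
arcs = Set([(1, 2), (3, 4), (5, 6)])

# Serialize: one column per tuple.
tuples_array = reduce(hcat, [collect(t) for t in arcs])

# Deserialize: rebuild the set column by column, as in the excerpt.
restored = Set{Tuple{Int, Int}}([
    (tuples_array[1, i], tuples_array[2, i]) for i in 1:size(tuples_array, 2)
])
```

The set's iteration order is arbitrary, but the reconstructed set is equal regardless of column order.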
```julia
branch_info = []
for (branch_type, branch_list) in bs.branches
    for br in branch_list
        push!(branch_info, Dict("name" => PSY.get_name(br), "type" => string(branch_type)))
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
push!(
    branch_info,
    Dict("name" => PSY.get_name(br), "type" => string(branch_type)),
)
```
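The metadata-only approach in this excerpt can be sketched without PowerSystems.jl. Here plain name strings stand in for PSY branch objects (in the real code the name comes from `PSY.get_name(br)`), so only the identifying name/type strings are kept:

```julia
# Mock input: branch type => list of branch names, standing in for PSY objects.
branches = Dict("Line" => ["line-1-2", "line-2-3"], "Transformer2W" => ["xf-3-4"])

branch_info = Dict{String, String}[]
for (branch_type, branch_list) in branches
    for br in branch_list
        # Real code: PSY.get_name(br) on a PSY branch object.
        push!(branch_info, Dict("name" => br, "type" => branch_type))
    end
end
```

This is why deserialization can only restore metadata: the name/type pairs identify branches in a loaded system but cannot reconstruct the objects themselves.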
```julia
if haskey(reduction_data, "radial_reduction")
    rd = reduction_data["radial_reduction"]
    rc.radial_reduction = RadialReduction(
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
rc.radial_reduction = RadialReduction(;
```
```julia
if haskey(reduction_data, "degree_two_reduction")
    d2d = reduction_data["degree_two_reduction"]
    rc.degree_two_reduction = DegreeTwoReduction(
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
rc.degree_two_reduction = DegreeTwoReduction(;
```
```julia
    d2d = reduction_data["degree_two_reduction"]
    rc.degree_two_reduction = DegreeTwoReduction(
        irreducible_buses = collect(Int, d2d["irreducible_buses"]),
        reduce_reactive_power_injectors = Bool(d2d["reduce_reactive_power_injectors"]),
```
[JuliaFormatter] reported by reviewdog 🐶
```suggestion
reduce_reactive_power_injectors = Bool(
    d2d["reduce_reactive_power_injectors"],
),
```
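The reconstruction pattern in these excerpts can be sketched end to end. `DegreeTwoReductionSketch` is a stand-in for the package's DegreeTwoReduction (its real fields may differ); note the keyword-constructor call with the leading `;`:

```julia
# Stand-in for the package's DegreeTwoReduction struct; fields are illustrative.
Base.@kwdef struct DegreeTwoReductionSketch
    irreducible_buses::Vector{Int} = Int[]
    reduce_reactive_power_injectors::Bool = false
end

# Parsed dictionary as it might come back from JSON (values loosely typed).
reduction_data = Dict{String, Any}(
    "degree_two_reduction" => Dict{String, Any}(
        "irreducible_buses" => Any[1, 5],
        "reduce_reactive_power_injectors" => true,
    ),
)

d2r = if haskey(reduction_data, "degree_two_reduction")
    d2d = reduction_data["degree_two_reduction"]
    DegreeTwoReductionSketch(;
        irreducible_buses = collect(Int, d2d["irreducible_buses"]),
        reduce_reactive_power_injectors = Bool(d2d["reduce_reactive_power_injectors"]),
    )
else
    DegreeTwoReductionSketch()  # field absent: keep the empty default
end
```

Guarding each reduction type with `haskey` is what lets a ReductionContainer with only some reductions round-trip cleanly.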
3c0ff40 to 1cb3523
Thanks for opening a PR to PowerNetworkMatrices.jl, please take note of the following when making a PR:
- Check the contributor guidelines