feat: bulk data import/export commands (#7)
Closed
chrisaddams wants to merge 2 commits into main from
Conversation
- Add comprehensive bulk operations namespace with core interfaces
- Implement streaming CSV and JSON data readers with robust parsing
- Add progress tracking infrastructure with rich console output
- Create bulk import command with validation and error handling
- Add bulk export functionality with multiple format support
- Implement resilient bulk processor with parallel execution
- Add data validation framework with configurable rules
- Create comprehensive test suite for all components
- Support for large datasets with memory-efficient streaming
- Rich CLI integration with progress reporting and error handling

This framework provides enterprise-grade bulk data operations with streaming support, parallel processing, and comprehensive error handling. It addresses critical user needs for importing and exporting large datasets efficiently and reliably.
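The project itself is C#/.NET, but the "memory-efficient streaming with progress reporting" idea above can be sketched in a few lines of Python. This is a hypothetical illustration only: `stream_csv_records` and `progress_every` are names invented here, not identifiers from the PR.

```python
import csv
import os
from typing import Dict, Iterator

def stream_csv_records(path: str, progress_every: int = 1000) -> Iterator[Dict[str, str]]:
    """Yield one record at a time so memory use stays flat regardless of file size.

    Illustrative sketch only: the PR implements this as a streaming C# reader;
    a Python generator shows the same one-record-at-a-time discipline.
    """
    total_bytes = os.path.getsize(path)
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)  # parses the header row, then streams rows
        for i, row in enumerate(reader, start=1):
            if i % progress_every == 0:
                # Report record count; the file is never loaded wholesale.
                print(f"processed {i} records of a ~{total_bytes}-byte file")
            yield row
```

Because the function is a generator, a caller can validate, transform, or batch records as they arrive and abort early without paying for the rest of the file.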
- Remove duplicate test files, test data, and standalone test project
- Fix BulkDataProcessor elapsed time calculation (was nonsensical)
- Fix export pipeline: stream through exporter directly instead of no-op processor
- Replace full-file stream reads with file-size estimate for CSV record count
- Remove double-parse in JsonDataReader that defeated streaming
- Add 1s regex timeout in BasicDataValidator to prevent ReDoS
- Fix JsonValue unwrapping in validator (int/long/bool were not extracted)
- Fix all compiler warnings (async without await, nullable reference)
- Convert tests to proper xUnit with FluentAssertions (13 tests, all passing)
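The "file-size estimate for CSV record count" fix above replaces a full read of the file with arithmetic on a small sample. Here is a hedged Python sketch of that technique (the actual fix is in the project's C# code; `estimate_csv_record_count` and `sample_lines` are names invented for this example):

```python
import os

def estimate_csv_record_count(path: str, sample_lines: int = 100) -> int:
    """Estimate the number of data records from file size and a line-size sample.

    Illustrative sketch: reading the whole file just to count records defeats
    streaming for large inputs; instead, sample a few lines, compute the
    average bytes per record, and divide the remaining file size by it.
    """
    total_bytes = os.path.getsize(path)
    sampled = 0
    sampled_bytes = 0
    with open(path, "rb") as f:
        header = f.readline()  # the header row is not a data record
        for line in f:
            sampled += 1
            sampled_bytes += len(line)
            if sampled >= sample_lines:
                break
    if sampled == 0:
        return 0
    avg_bytes_per_record = sampled_bytes / sampled
    return max(sampled, round((total_bytes - len(header)) / avg_bytes_per_record))
```

The estimate is exact when the sample covers the whole file and proportionally accurate otherwise, which is good enough for progress bars, the use this PR puts it to.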
Closing in favour of keeping the fixes on #3 (the original contributor's PR).
Summary
- `data import` command: import CSV/JSON files into entities with validation, progress tracking, and retry logic
- `data export` command: export entity data to CSV/JSON with streaming and pagination

Based on #3 by @nivedhapalani96 with the following fixes:
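The retry logic mentioned for `data import` can be sketched as exponential backoff around a single batch. This is a hypothetical Python illustration of the pattern, not the PR's C# implementation; `import_with_retry`, `max_attempts`, and `base_delay` are names invented here:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def import_with_retry(action: Callable[[], T],
                      max_attempts: int = 3,
                      base_delay: float = 0.5) -> T:
    """Run one import batch, retrying transient failures with exponential backoff.

    Illustrative sketch: delays grow as base_delay * 2^(attempt - 1), and the
    last failure is re-raised so the caller can report the record that failed.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return action()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * 2 ** (attempt - 1))
    raise AssertionError("unreachable")
```

Scoping the retry to one batch rather than the whole import keeps a single bad record from forcing the file to be reprocessed from the start.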
Test plan
- `dotnet build`: 0 warnings, 0 errors
- `dotnet test`: 205/205 passing (13 new bulk operations tests)
- `data import test.csv entity_name --dry-run`
- `data export entity_name output.csv`