| Summary: | Cannot import massive data | | |
|---|---|---|---|
| Product: | [Applications] LabPlot2 | Reporter: | Martin Tlustos <martin.tlustos> |
| Component: | general | Assignee: | Alexander Semke <alexander.semke> |
| Status: | RESOLVED FIXED | | |
| Severity: | major | | |
| Priority: | NOR | | |
| Version First Reported In: | 2.9.0 | | |
| Target Milestone: | --- | | |
| Platform: | Neon | | |
| OS: | Linux | | |
| Latest Commit: | https://invent.kde.org/education/labplot/-/commit/66d07cef16c82a889d459d5deea12480dfefa5d4 | Version Fixed/Implemented In: | 2.10.1 |
| Sentry Crash Report: | | | |
Description
Martin Tlustos
2023-04-06 14:29:16 UTC
Thank you for reporting this issue! I did a small fix now to reduce the peak memory consumption and also to avoid crashing when running out of memory. We'll bring the fix into the next patch release of 2.10. More optimizations are possible here; we'll track these activities in https://invent.kde.org/education/labplot/-/issues/549. My test data was a bit bigger than in your example (150k rows and 20 columns) and the peak memory consumption was at 514MB. If LabPlot is crashing with your data, the memory consumption on your system is probably already very high, or you may be working with a lot of text data. The file import is more efficient, so ideally we should address the other bug ticket that you raised. Please check my reply there.
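
The fix described above (lower peak memory consumption, no crash when memory runs out) corresponds to a general pattern: reserve the target storage once up front and catch allocation failures instead of letting them terminate the process. The snippet below is a minimal, self-contained sketch of that pattern under those assumptions, not LabPlot's actual import code; the function name `importRows` and its signature are hypothetical.

```cpp
#include <iostream>
#include <new>
#include <string>
#include <vector>

// Hypothetical import routine: fills a column-oriented buffer row by row.
// The point is the error handling, not the parsing: allocation failures are
// caught and reported instead of terminating the application.
bool importRows(std::size_t rowCount, std::size_t columnCount,
                std::vector<std::vector<double>>& columns, std::string& error) {
    try {
        columns.resize(columnCount);
        for (auto& column : columns)
            column.reserve(rowCount); // reserve once to avoid repeated reallocations and memory spikes

        for (std::size_t row = 0; row < rowCount; ++row)
            for (std::size_t col = 0; col < columnCount; ++col)
                columns[col].push_back(static_cast<double>(row)); // placeholder for a parsed value
    } catch (const std::bad_alloc&) {
        error = "Not enough memory to import the data.";
        columns.clear();
        columns.shrink_to_fit(); // release whatever was already allocated
        return false;
    }
    return true;
}

int main() {
    std::vector<std::vector<double>> columns;
    std::string error;
    // 150k rows x 20 columns, roughly the data size mentioned in the comment above.
    if (!importRows(150000, 20, columns, error))
        std::cerr << error << '\n';
    else
        std::cout << "Imported " << columns.size() << " columns of "
                  << columns[0].size() << " rows\n";
}
```

Reserving the full column size before filling it keeps the peak allocation close to the final data size, and catching `std::bad_alloc` lets the application show an error dialog instead of aborting, which is the behavior the fix aims for.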