STEPS TO REPRODUCE
1. Create a new document (width: 3508, height: 4960, resolution: 300) with 32-bit float/channel.
2. Create a new document (width: 3508, height: 4960, resolution: 300) with 16-bit float/channel.
3. Crash. If it does not crash, keep creating documents with different bit depths.

OBSERVED RESULT
Does not crash when "Canvas acceleration" is disabled or when "Use texture buffer" is disabled. Could it be that the texture buffer capacity gets exhausted?

SOFTWARE VERSIONS
Version: 4.2.0-pre-alpha (git 0104ab7)
Qt Version (compiled): 5.9.3
Qt Version (loaded): 5.9.3

OS Information
Build ABI: x86_64-little_endian-llp64
Build CPU: x86_64
CPU: x86_64
Kernel Type: winnt
Kernel Version: 10.0.10586

ADDITIONAL INFORMATION
OpenGL Info
Vendor: Intel
Renderer: "Intel(R) HD Graphics 4400"
Version: "3.0.0 - Build 10.18.15.4248"
Shading language: 1.30 - Build 10.18.15.4248
Requested format: QSurfaceFormat(version 3.0, options QFlags<QSurfaceFormat::FormatOption>(DeprecatedFunctions), depthBufferSize 24, redBufferSize -1, greenBufferSize -1, blueBufferSize -1, alphaBufferSize -1, stencilBufferSize 8, samples -1, swapBehavior QSurfaceFormat::SwapBehavior(DoubleBuffer), swapInterval 0, profile QSurfaceFormat::OpenGLContextProfile(CompatibilityProfile))
Current format: QSurfaceFormat(version 3.0, options QFlags<QSurfaceFormat::FormatOption>(DeprecatedFunctions), depthBufferSize 24, redBufferSize 8, greenBufferSize 8, blueBufferSize 8, alphaBufferSize 8, stencilBufferSize 8, samples 0, swapBehavior QSurfaceFormat::SwapBehavior(DoubleBuffer), swapInterval 1, profile QSurfaceFormat::OpenGLContextProfile(NoProfile))
Version: 3.0
Supports deprecated functions: true
Is OpenGL ES: false

QPA OpenGL Detection Info
supportsDesktopGL: true
supportsAngleD3D11: true
isQtPreferAngle: false
overridePreferAngle: true

Internal memory: approx. 4148 MB (according to dxdiag)
Using Direct3D 11 via ANGLE (auto-selected by Krita)

Probably related to https://bugs.kde.org/show_bug.cgi?id=393509
Yes, it's certainly possible and even likely that you're running out of some system resources.
I think this isn't something we can actually do much about... We don't get any exceptions or notifications that we're running out of GPU memory, so we can't guard against it.