View Issue Details
ID | Project | Category | View Status | Date Submitted | Last Update |
---|---|---|---|---|---|
0000101 | Heureka | Other | public | 2017-02-14 09:49 | 2017-09-05 16:26 |
Reporter | Peder | Assigned To | pst |||
Priority | normal | Severity | minor | Reproducibility | have not tried |
Status | closed | Resolution | fixed | ||
Product Version | 2.6.0 | ||||
Fixed in Version | 2.6.1 | ||||
Summary | 0000101: Database error when calculating common border length for very large dataset | ||||
Description | Calculating the common border length for 50000 stands throws an exception. Is a bulk insert needed? Test data is available under Trunk/Testdata/Hexagoner; import it as a stand register + map. Error message: "The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information." Source: .Net SqlClient Data Provider. Stacktrace: at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady) at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData() at System.Data.SqlClient.SqlDataReader.get_MetaData() at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption) at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest) at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry) at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method) at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method) at Slu.Heureka.DomainLayer.GIS.ShapeReader.GetPolygons(DBInfo dbinfo, IEnumerable`1 treatmentUnitGuids) at Slu.Heureka.DomainLayer.GIS.SpatialRelations.ShapeNeighbour..ctor(DBInfo dbInfo, IEnumerable`1 tuColl) at Slu.Heureka.DomainLayer.GIS.SpatialRelations.AdjacenciesCommand.Do() | ||||
Tags | No tags attached. | ||||
Product | PlanWise | ||||
The problem was that ShapeReader.GetPolygons issued an IN clause with 500000 treatment units. The solution was to process them in batches. |
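A minimal sketch of the batching fix, in Python for illustration (the actual fix is in the C# ShapeReader; the table name, column names, and batch size below are hypothetical):

```python
def batched(items, batch_size=1000):
    """Yield successive slices of at most batch_size items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def fetch_polygons(cursor, guids, batch_size=1000):
    # Instead of a single IN (...) clause with hundreds of thousands of
    # values -- which can make the query processor run out of internal
    # resources -- issue one query per batch and concatenate the results.
    polygons = []
    for batch in batched(guids, batch_size):
        placeholders = ",".join("?" * len(batch))
        cursor.execute(
            f"SELECT Guid, Shape FROM Polygons WHERE Guid IN ({placeholders})",
            batch,
        )
        polygons.extend(cursor.fetchall())
    return polygons
```

The batch size is a tuning knob: it must stay well below the point where the query planner gives up, while keeping the number of round trips reasonable.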
Date Modified | Username | Field | Change |
---|---|---|---|
2017-02-14 09:49 | Peder | New Issue | |
2017-02-14 14:50 | Peder | Description Updated | |
2017-02-15 09:08 | Peder | Description Updated | |
2017-02-15 09:14 | Peder | Assigned To | => pst |
2017-02-15 09:14 | Peder | Status | new => assigned |
2017-02-15 15:29 | | Note Added: 0000105 | |
2017-02-15 15:30 | | Status | assigned => resolved |
2017-02-15 15:30 | | Fixed in Version | => 2.6.1 |
2017-02-15 15:30 | | Resolution | open => fixed |
2017-09-05 16:26 | Peder | Status | resolved => closed |