Product export error: Too many records that could be returned (SBL-DAT-00500)

Error pattern 


In a newly created environment, the product export fails with the following error:
The 'NextRecord' method of Business Component 'Object Rule Nodes ImpExp BC' (Integration Component 'Object Rule Nodes ImpExp BC') returned the following error:
"Too many records that could be returned. Refine your query so that fewer rows are retrieved. (SBL-DAT-00500)"

Cause
The values of DSMaxFetchArraySize and MaxCursorSize are too small for the number of records to be exported.

Solution:

Set DSMaxFetchArraySize and MaxCursorSize to -1 (no limit).

Oracle Support delivered the following solution in November 2010:

Change the file siebsrvr\BIN\ENU\siebel.cfg (for the local development thick client).

New values:

[ServerDataSrc]
MaxCursorSize = -1
MaxFetchArraySize = -1

Original values, for comparison:

[ServerDataSrc]
MaxCursorSize = %MAXCURSORSIZE%
MaxFetchArraySize not present

Change for the Siebel web client:

Site map -> Administration - Server Configuration -> Enterprise -> Profile Configuration
Search for ServerDataSrc in the Alias column.
In the lower applet ("SA-VBC Parameter Named Subsystem") click the "Advanced" button and search for DSMaxFetchArraySize in the Alias column.

Change the value of DSMaxFetchArraySize from 0 (the default) to -1.

Then restart the server.
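The same named-subsystem parameter can also be changed from the srvrmgr command line instead of the GUI. The session below is a sketch under assumptions: gateway, enterprise, and login values are placeholders, and the parameter alias DSMaxCursorSize for the cursor limit is assumed to mirror DSMaxFetchArraySize:

```
srvrmgr /g GATEWAY /e ENTERPRISE /u SADMIN /p PASSWORD
srvrmgr> change param DSMaxFetchArraySize=-1 for named subsystem ServerDataSrc
srvrmgr> change param DSMaxCursorSize=-1 for named subsystem ServerDataSrc
```

As with the GUI change, the server must be restarted for the new values to take effect.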

 

Solution with eScript:
Joshua Weir from Australia has shown a workaround using eScript.

For example, the following script would fail with SBL-DAT-00500 if the number of records returned by the query is greater than the maximum cursor size:

var boTest;
var bcTest;
var nRecChk;

try {
    boTest = TheApplication().GetBusObject("Test");
    bcTest = boTest.GetBusComp("Test");
    with (bcTest)
    {
        ClearToQuery();
        SetViewMode(AllView);
        ExecuteQuery(ForwardOnly);
        nRecChk = FirstRecord();
        while (nRecChk)
        {
            // scroll through all records
            nRecChk = NextRecord();
        }
    }
}
catch (e)
{
    TheApplication().RaiseErrorText(e.toString());
}
finally
{
    bcTest = null;
    boTest = null;
}


You can work around this problem by querying the data set in chunks and re-querying after each chunk has been looped through. For example, the above code can be modified as follows (note the corrected variable name strLastRecordAccessed):

 

var boTest;
var bcTest;
var nRecChk;
var nCursorCounter;
var nMaxCursorSize;
var bRecordsToProcess;
var strLastRecordAccessed;

try {
    nCursorCounter = 0;
    nMaxCursorSize = 5000;      // the maximum cursor size is 5000 records
    strLastRecordAccessed = 0;
    bRecordsToProcess = true;

    while (bRecordsToProcess)
    {
        boTest = TheApplication().GetBusObject("Test");
        bcTest = boTest.GetBusComp("Test");
        with (bcTest)
        {
            ClearToQuery();
            SetViewMode(AllView);
            SetSearchExpr("[Id] > '" + strLastRecordAccessed + "'");
            SetSortSpec("Id (ASCENDING)");
            ExecuteQuery(ForwardOnly);
            nRecChk = FirstRecord();
            while (nRecChk)
            {
                nCursorCounter++;

                strLastRecordAccessed = GetFieldValue("Id");

                // scroll through the records of the current chunk
                if (nCursorCounter < nMaxCursorSize)
                    nRecChk = NextRecord();
                else
                    nRecChk = false;
            }
        }

        // if the number of records processed is less than the
        // cursor size, there are no records left to process
        if (nCursorCounter < nMaxCursorSize)
            bRecordsToProcess = false;

        nCursorCounter = 0; // reset the counter
    }
}
catch (e)
{
    TheApplication().RaiseErrorText(e.toString());
}
finally
{
    bcTest = null;
    boTest = null;
}