After creating a new datastore I ran into a problem where the extractor would throw an error: "Error calling RFC function to get table data: <RFC_ABAP_EXCEPTION-(Exception_Key: DATA_BUFFER_EXCEEDED, SY-MSGTY: E, SY-MSGID: FL, SY-MSGNO: 046)>". After a bit of research on the matter I came across the following comment on one of the SDN pages:


To read a table via RFC we can call the RFC_READ_TABLE function. It can return rows of at most 512 characters (unless the Z_AW_* functions shipped with Data Services are installed in SAP) – if the list of columns is wider than that, the call will fail. You can also specify a WHERE clause.
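To make that 512-character limit concrete, here is a minimal Python sketch (not SAP or Data Services code; the field names and widths are hypothetical examples) that checks whether a requested column list would fit in one RFC_READ_TABLE row:

```python
# RFC_READ_TABLE returns each row as a single character buffer of at
# most 512 characters; asking for more columns than fit in that buffer
# raises DATA_BUFFER_EXCEEDED. Widths below are illustrative only.

RFC_READ_TABLE_BUFFER = 512  # row buffer of the standard function

def fits_in_buffer(field_widths, buffer_size=RFC_READ_TABLE_BUFFER):
    """True if the combined width of the selected columns fits one row."""
    return sum(field_widths.values()) <= buffer_size

# A narrow selection fits; adding a very wide column does not.
narrow = {"BUKRS": 4, "BELNR": 10, "GJAHR": 4, "SGTXT": 50}
wide = dict(narrow, LONGTEXT=500)

print(fits_in_buffer(narrow))  # True  (68 <= 512)
print(fits_in_buffer(wide))    # False (568 > 512)
```

In practice this is the check to run before building the extractor's column list: if it fails, either trim columns or install the wider-buffer function described below.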

We can write our own RFC function, which can return more bytes per row and even take an entire SQL statement as input. That way we can also join tables – but only if they can be joined inside SQL, so joining pool or cluster tables is impossible. As a result, performance when reading something as important as financial documents (the BSEG table) will be unacceptable.

And we will certainly not be able to call SAP functions inside it.

Further research led me to OSS note 1186277, which provided the following information:

To work around this issue, reduce the number of columns that the source extracts.

To fix this issue, Data Integrator provides a version of RFC_READ_TABLE that allows a buffer of 2048 bytes. This is adequate for viewing data in most SAP tables. To use this function, upload the Data Integrator transport files to the SAP system.

For more information on uploading the transport files, see the "Installing Data Integrator functions on SAP R/3" section in Chapter 2 of the Data Integrator Supplement for SAP.

Following those instructions, I read the "Installing Data Integrator functions on SAP R/3" section in the "Data Integrator Supplement for SAP" document, and that fixed the problem.
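Putting the two buffer sizes together, the choice between the standard function and the Data Integrator-supplied one boils down to a row-width check. A small sketch of that decision (the 512 and 2048 limits come from the note above; the Z_AW_* function name is a hypothetical stand-in, since the actual name comes with the transports):

```python
# Illustrative decision helper; not part of Data Services itself.

STANDARD_BUFFER = 512   # standard RFC_READ_TABLE row limit
DI_BUFFER = 2048        # buffer in the Data Integrator-supplied version

def choose_reader(row_width):
    """Pick a reader for a row of the given width, or None if neither fits."""
    if row_width <= STANDARD_BUFFER:
        return "RFC_READ_TABLE"
    if row_width <= DI_BUFFER:
        # Hypothetical name; the real Z_AW_* function is installed
        # by uploading the Data Integrator transport files.
        return "Z_AW_RFC_READ_TABLE"
    return None  # neither fits: reduce the column list instead

print(choose_reader(400))   # RFC_READ_TABLE
print(choose_reader(1500))  # Z_AW_RFC_READ_TABLE
print(choose_reader(4096))  # None
```

So in my case, uploading the transports moved my table from the "None" bucket into the 2048-byte one, and the extraction ran cleanly.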
