Channel: SCN: Message List

Large size of BC_JMSQUEUE


Hi,

 

We have a problem: the number of records in the table BC_JMSQUEUE has grown to 4 million. Can anyone share some best practices on how to keep this number low? Is it advisable to delete records from this table directly?

 

 

To give you guys a better idea, we have implemented SAP Manufacturing Execution 6.1.5.6, and one of our customizations makes use of JMS queues. The flow is described below:

 

1. XML messages come in and are put on PRIMARY queues depending on the message type.

2. MDB consumers listening to the PRIMARY queues process the messages.

3. If an error occurs during processing, the message gets transferred to a designated ERROR queue.
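To make the flow above concrete, here is a minimal sketch of the consume-and-forward-on-error pattern. An in-memory queue stands in for the JMS session/producer so the sketch is self-contained; a real MDB's onMessage() would instead send the failed javax.jms.Message to the ERROR queue via a MessageProducer. The class and method names here are hypothetical, not from our actual code.

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class PrimaryConsumerSketch {
    // stand-in for the designated ERROR queue
    static final Queue<String> ERROR_QUEUE = new ArrayDeque<>();

    // stand-in for the MDB's onMessage(): process the message,
    // and on failure forward it to the ERROR queue
    static void onMessage(String xml) {
        try {
            process(xml);
        } catch (RuntimeException e) {
            // in a real MDB: session.createProducer(errorQueue).send(message);
            ERROR_QUEUE.add(xml);
        }
    }

    // dummy processing step: reject anything that is not XML-shaped
    static void process(String xml) {
        if (!xml.startsWith("<")) {
            throw new RuntimeException("not XML: " + xml);
        }
    }

    public static void main(String[] args) {
        onMessage("<order/>");  // processed successfully
        onMessage("garbage");   // fails, forwarded to ERROR queue
        System.out.println(ERROR_QUEUE.size()); // prints 1
    }
}
```

Note that with this pattern each failed message exists twice from the persistence layer's point of view: once as the original PRIMARY-queue entry and once as the forwarded ERROR-queue entry, which matches the double-record behavior described below.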

 

We also have a customized application that is able to browse and then delete the messages on the error queues if needed.

 

We've noticed the following behavior with regard to the BC_JMSQUEUE table:

 

a. Whenever a message gets put on the PRIMARY queue, a record is inserted into the BC_JMSQUEUE table. Once processing succeeds, the message is consumed and is no longer accessible; however, the record remains in BC_JMSQUEUE.

 

b. Whenever a message gets put on the PRIMARY queue, a record is inserted into the BC_JMSQUEUE table. When an error is encountered and the message is moved to the ERROR queue, another record is inserted into the table. So in this case, two records were inserted for a single message.

 

c. If we delete the message from the ERROR queue (via forced consumption using the receiveNoWait() method), only one record gets removed from the BC_JMSQUEUE table; presumably this is the one that was inserted when the message was moved to the ERROR queue.
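The "forced consumption" in point (c) can be sketched as a drain loop: call receiveNoWait() until it returns null, which in JMS means the queue is empty. An in-memory Deque stands in here for a javax.jms.MessageConsumer so the sketch is self-contained; the MessageSource interface and drainErrorQueue() are hypothetical names, not our actual code.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class ErrorQueueDrainSketch {
    // minimal stand-in for MessageConsumer.receiveNoWait():
    // returns the next message, or null when nothing is pending
    interface MessageSource {
        String receiveNoWait();
    }

    // drain the queue by forced consumption, counting removed messages
    static int drainErrorQueue(MessageSource consumer) {
        int removed = 0;
        // receiveNoWait() returns null immediately when the queue is
        // empty, so the loop stops as soon as everything is consumed
        while (consumer.receiveNoWait() != null) {
            removed++;
        }
        return removed;
    }

    public static void main(String[] args) {
        Deque<String> backing = new ArrayDeque<>();
        backing.add("<failed-msg-1/>");
        backing.add("<failed-msg-2/>");
        // Deque.poll() also returns null when empty, matching the stand-in
        int n = drainErrorQueue(backing::poll);
        System.out.println(n); // prints 2
    }
}
```

As observed, this kind of drain only removes the ERROR-queue entry; the record created for the original PRIMARY-queue delivery is untouched by it.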

 

Appreciate all the help that you guys can provide.

