
GL865 Quad default packet size?


Hello there, I've been working with a Telit GL865 QUAD for a couple of weeks now and I'm having trouble understanding a "problem" I keep running into. I'm trying to establish a TCP/IP connection from the module to a Java application running on a server. The server waits for a client to connect on a specific port and then handles the resulting socket. I need to send more than 10 KB of data across the connection, because I'm transferring several JPEG images over the network.

The first problem: I'm feeding the module from a Stellaris LM4F120XL LaunchPad MCU over a UART at 115200 baud, and no matter how much data I sent, the server never received more than 4096 bytes. I can't find this limit explained in any datasheet, and I'd like to know what causes it. Is it because I'm sending the data slowly compared to how fast the server reads it? Is it something internal to the GSM module, or to the MCU I'm using? On the Java side I have to use Thread.sleep, because otherwise the program runs ahead of the incoming data and exits without receiving anything.

I more or less worked around it by splitting the data into 1024-byte packets on the MCU and sending only that much over the UART at a time. The problem now is that I need delays on both sides: on the server, so the data has a chance to arrive before my while loop checks the InputStream and decides there is no data ready to read, and on the MCU, after every 1024-byte packet and again after every 4096 bytes sent (the same 4096 boundary showing up again). These delays let more packets actually reach the server. For example, before the per-packet delays only about 8 KB of the 13,500 bytes I'm testing with arrived; after adding the delay at every 4096 bytes, about 12 KB of the 13,500 arrive. So the delays are clearly needed, but I don't understand the pattern behind them yet. Right now I'm still losing packets at around the 10 KB mark (give or take 1 KB) and the last ~1.5 KB never arrives; maybe adding more delays around the 10 KB mark will get the cookie to crumble (more packets to arrive).
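To give an idea of what I mean on the server side, here is a simplified sketch of the kind of receive loop I'm describing (this is not the exact attached file; the port number and buffer size are just placeholders). It blocks on read() until bytes arrive or the client closes the socket, instead of relying on Thread.sleep:

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class TcpReceiveSketch {
    public static void main(String[] args) throws Exception {
        int port = 5000; // placeholder port, not necessarily the one I actually use

        try (ServerSocket server = new ServerSocket(port)) {
            System.out.println("Waiting for the GL865 to connect on port " + port);
            try (Socket client = server.accept();
                 InputStream in = client.getInputStream()) {

                ByteArrayOutputStream received = new ByteArrayOutputStream();
                byte[] buffer = new byte[1024];
                int n;

                // read() blocks until at least one byte is available or the
                // remote side closes the connection (returns -1), so the loop
                // does not exit early just because data is still in flight.
                while ((n = in.read(buffer)) != -1) {
                    received.write(buffer, 0, n);
                    System.out.println("Total bytes so far: " + received.size());
                }

                System.out.println("Connection closed, received " + received.size() + " bytes");
                // received.toByteArray() would then hold the JPEG data
            }
        }
    }
}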


I'd love some advice on this behavior, and pointers to any good reads on the subject. Any thoughts or ideas are welcome; anything that points me in the right direction will help.


Thanks! Attached are the server file and the code showing how I'm using delays when sending the data packets from the MCU.