Endless data requests when using POST with datatable

Hi,

When I try to use POST with a datatable, it sends endless data requests to the server; it seems like a serious bug.

Steps to reproduce it:

  1. Open the sample below; it’s cloned from an online sample, with some properties changed, such as size, datafetch and url (post->):
    https://snippet.webix.com/8lp2ez6r

  2. Click any page > 1, like Page 2, Page 7, etc.

  3. You will see endless (dead-loop) data requests sent to the server

It was tested in the latest Google Chrome.
The GET method seems to work OK.

Regards
Wicky

Hello,

Our proxy doesn’t provide for dynamic loading, so it has to be done manually.

Sorry, I don’t understand what you mean?

I can simulate the case by using local dynamic data as the source.

The key is to set both datafetch and pager size, and to use POST instead of GET.

Per my understanding, datatable should support both GET and POST, right?

Regards
Wicky

Hi,

I understood your point that “Our proxy doesn’t provide for dynamic loading”.

But in this case, when the requested data is not what was expected, wouldn’t it be better to raise an error instead of sending endless data requests?

And again, in my tests POST is still not working with the following example settings (a configuration sketch follows the list):

datafetch = 50
pager size = 25
url = post->…
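For reference, here is a minimal sketch of that kind of configuration; the grid id, the autoConfig shortcut, the pager container and the /data endpoint are placeholders of mine, not from the actual snippet:

  webix.ui({
      view: "datatable",
      id: "mygrid",                  // hypothetical id
      autoConfig: true,              // placeholder for the real column definitions
      datafetch: 50,                 // rows requested per dynamic-loading call
      pager: { container: "pagerBox", size: 25, group: 5 },
      url: "post->/data"             // POST proxy; the real endpoint is omitted in this thread
  });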

The server side will just pick 50 records and return (see the example response below):
pos = start
data = 50 records
total_count = total number of records
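For illustration, a response to a request with start = 50 would then have roughly this shape (the values and the name field are invented, and only two of the 50 records are shown):

  {
      "pos": 50,
      "total_count": 1000,
      "data": [
          { "id": 51, "name": "record 51" },
          { "id": 52, "name": "record 52" }
      ]
  }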

After clicking some pages, for example Page 2->3->8->7, the endless data requests suddenly start.

Could you provide a working online POST sample (datatable)?

Regards
Wicky

Check if your data items have a repeating id property.
In that case an endless request loop is very possible.
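For example, a chunk like the following (invented ids) can cause it; the repeated id usually does not produce a new row, so the datatable keeps asking for the rows it still considers missing:

  "data": [
      { "id": 51, "name": "record 51" },
      { "id": 52, "name": "record 52" },
      { "id": 51, "name": "a different record with the same id" }
  ]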

Hi,

I have an ID column in the table as the primary key, so it’s not possible to have duplicated IDs.
And the id column in the datatable is also unique.

It seems that when clicking an even page and then the previous odd page, endless requests are sent out, for example:

if I click page 1->2->3->4->5…, it’s OK, no problem
if I click page 4->3, or page 8->7 (it needs to be the first click, without cache), there are endless data requests

Based on the browser’s F12 network monitoring, my guess is:

  1. datafetch is 50 and size is 25, so two pages of data are fetched by one request
  2. when page 4 is clicked for the first time, data for pages 4 and 5 is fetched
  3. when page 3 is then clicked for the first time, data for pages 3 and 4 is fetched
  4. the overlapping page 4 data is fetched again but somehow not handled properly (see the sketch after this list)
  5. endless requests are sent to the server to get pages 3 and 4; a dead loop
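To make the overlap concrete, here is a tiny sketch of the request-window arithmetic, assuming a 0-based start offset and the values above; this is my own illustration, not the datatable internals:

  function requestWindow(pageNumber, pagerSize, datafetch) {
      var start = (pageNumber - 1) * pagerSize;        // first row of the clicked page
      return { start: start, end: start + datafetch - 1 };
  }

  console.log(requestWindow(4, 25, 50));   // { start: 75, end: 124 } -> covers pages 4 and 5
  console.log(requestWindow(3, 25, 50));   // { start: 50, end: 99 }  -> covers pages 3 and 4; rows 75..99 arrive twice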

Regards
Wicky

Hi,

Thanks to intregal! You pointed out the truth!

After carefully tracing and comparing the data items, I finally found a mistake in the server-side script which could lead to duplicated items being fetched (compared to the items already loaded in the client-side datatable).

So the endless data requests were caused by fetched data items that already existed on the client side. The problem was resolved after the server-side script was fixed.

After reviewing the case, I suggest adding some checks to the datatable so that endless data requests might be avoided; I think the datatable shouldn’t be broken by a mistake in the data.

For example (a sketch of checks 2 and 3 follows this list):

  1. When loading is in progress and a new load request with the same start (and count?) is triggered, the new load request shouldn’t be queued (in the _feed function?), and the user should be warned in the debug version

  2. If the server returns unexpected data (pos <> start?), the datatable should stop requesting and warn the user

  3. If the server returns items which are duplicates (all of them already exist on the client side, or the start item is duplicated), the datatable should stop requesting and warn the user
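To make checks 2 and 3 concrete, here is a rough client-side sketch of the idea in plain JavaScript, built around the documented exists() method; it is only an illustration, not the actual _feed logic:

  // Returns false if a loaded chunk looks wrong and should not be processed further.
  function isChunkSane(grid, requestedStart, response) {
      if (response.pos != requestedStart) {             // check 2: unexpected position
          console.warn("pos", response.pos, "differs from the requested start", requestedStart);
          return false;
      }
      var allKnown = response.data.length > 0 && response.data.every(function (item) {
          return grid.exists(item.id);                  // check 3: item already loaded on the client
      });
      if (allKnown) {
          console.warn("server returned only already-loaded items; stopping further requests");
          return false;
      }
      return true;
  }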

Thanks and Best Regards
Wicky Hu

Just for your information, I have run into two cases of datatable dead-loop requests:

case 1:

  1. server paging
  2. an id column exists
  3. the server returns duplicated ids when paging (this normally happens when there is no sorting by id);
    if the datatable cannot get the required records, it just keeps requesting until the browser hangs

case 2:

  1. server paging
  2. an id column exists
  3. an external filter is registered, but not applied on change
  4. a proxy is used in the url
  5. the datatable requests the first set of data initially (for example, total records = 1000)
  6. a value is entered into the external filter (not applied on the client, but it limits the total records on the server; for example, the new total is 5 records)
  7. click page 2->3->4; once the datatable starts to request data, it keeps requesting until the browser hangs
  8. the reason is that the new filter is applied to the requests and the total record count changes, but the client-side total is not updated, so the datatable just keeps requesting a non-existent page (a possible client-side workaround is sketched below)
  9. in the requests, “continue” is always true; it seems the filter change is not taken into account for the new server request
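For completeness, here is a minimal sketch of how case 2 can be avoided on the client side, assuming the external filter is a plain input and that the grid id, endpoint and parameter name are placeholders: reloading the datatable whenever the filter value changes keeps the client-side total_count in sync with the server.

  var filterInput = document.getElementById("externalFilter");   // hypothetical input element

  filterInput.addEventListener("change", function () {
      var grid = $$("mygrid");                                   // the datatable instance
      grid.clearAll();                                           // drop stale rows and the old total_count
      // reload through the same POST proxy, passing the filter value to the server
      grid.load("post->/data?filter=" + encodeURIComponent(filterInput.value));
  });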

Regards
Wicky