
maria-discuss team mailing list archive

Re: Help with history tables

 

Reported to the maria-developers list.


2014-05-21 19:51 GMT-03:00 Roberto Spadim <roberto@xxxxxxxxxxxxx>:

> =] SQLite has it:
> http://www.sqlite.org/lang_delete.html
>
>
> 2014-05-21 19:47 GMT-03:00 Roberto Spadim <roberto@xxxxxxxxxxxxx>:
>
>> anything like:
>> DELETE FROM history WHERE channel_id=10 ORDER BY date_time DESC
>> LIMIT 10000000000 OFFSET 50000 ?
>> I never checked OFFSET in a DELETE statement.
>>
>>
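A note on the syntax question above: MariaDB's single-table DELETE accepts ORDER BY and LIMIT row_count, but no OFFSET. A minimal, untested sketch of the usual workaround, assuming the HISTORY schema from the original post: look up the cutoff timestamp of the 50000th-newest row through a derived table (the LIMIT forces materialization, which sidesteps the "can't specify target table for update in FROM clause" restriction), then delete everything older.

-- keep only the newest 50000 rows of one channel (channel 10 as an example)
DELETE FROM HISTORY
WHERE CHANNEL = 10
  AND HISTORY_DATE < (
        SELECT cutoff FROM (
            SELECT HISTORY_DATE AS cutoff
            FROM HISTORY
            WHERE CHANNEL = 10
            ORDER BY HISTORY_DATE DESC
            LIMIT 1 OFFSET 49999
        ) AS t
      );

If the channel has fewer than 50000 rows, the subquery returns NULL and the DELETE removes nothing.
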
>> 2014-05-21 18:43 GMT-03:00 Federico Razzoli <federico_raz@xxxxxxxx>:
>>
>> I would use a batch operation (a cron job or an SQL event) which is
>>> triggered when the workload is low. If the queries are still a problem, you
>>> could relax the 50000 limit and use a statistical method. For example you
>>> could periodically compute how many rows per day are inserted (globally if
>>> possible, otherwise for each channel). You could do this once a month, for
>>> example. In the meanwhile, based on the last calculated AVG, you can use
>>> faster DELETEs based on the DATE column. You can do further optimizations,
>>> like cleaning some channels on Monday, others on Tuesday, etc. Everything
>>> depends on your workload and the speed of your COUNT(*)s and DELETEs.
>>>
>>> Federico
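
For the batch route, a minimal sketch of an SQL event, assuming the event scheduler is enabled (event_scheduler=ON); the 90-day horizon is purely illustrative and would come from the rows-per-day average described above:

-- hypothetical nightly purge driven by the date column instead of row counts
CREATE EVENT purge_history
  ON SCHEDULE EVERY 1 DAY
  STARTS '2014-05-22 03:00:00'
DO
  DELETE FROM HISTORY
  WHERE HISTORY_DATE < UNIX_TIMESTAMP(NOW() - INTERVAL 90 DAY);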
>>>
>>>
>>> --------------------------------------------
>>> On Wed, 21/5/14, Roberto Spadim <roberto@xxxxxxxxxxxxx> wrote:
>>>
>>>  Subject: [Maria-discuss] Help with history tables
>>>  To: "Maria Discuss" <maria-discuss@xxxxxxxxxxxxxxxxxxx>
>>>  Date: Wednesday, 21 May 2014 22:32
>>>
>>>  Hi guys, I have a doubt about a design.
>>>  I have a history table:
>>>
>>>  CREATE TABLE HISTORY (
>>>    CHANNEL INT NOT NULL DEFAULT 0,
>>>    HISTORY_DATE DECIMAL(22,6) NOT NULL DEFAULT '0' COMMENT 'UNIX TIME',
>>>    VALUE DECIMAL(22,6) NOT NULL DEFAULT 0,
>>>    PRIMARY KEY (CHANNEL, HISTORY_DATE)
>>>  );
>>>
>>>  My question is... I need a history of only 50000 rows for each channel.
>>>  What's the best method to keep a good history without many DELETEs,
>>>  ANALYZE TABLEs and other stuff, while keeping good table/disk usage?
>>>
>>>  I was considering a counter: for example, after 1000 inserts, run
>>>
>>>  SELECT COUNT(*) FROM HISTORY WHERE CHANNEL={channel_id}
>>>
>>>  to get the number of rows, and if it's > 50000, run:
>>>
>>>  DELETE FROM HISTORY WHERE CHANNEL={channel_id}
>>>  ORDER BY HISTORY_DATE ASC LIMIT {number_of_rows - 50000}
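
One wrinkle with this approach: LIMIT does not accept an expression, so the row count has to be substituted by the application or through a prepared statement. A server-side sketch for a single channel (channel 10 is just an example, untested):

SELECT COUNT(*) INTO @n FROM HISTORY WHERE CHANNEL = 10;
-- build the DELETE with a literal limit; GREATEST() keeps it at 0
-- while the channel is still under 50000 rows (LIMIT 0 deletes nothing)
SET @sql = CONCAT('DELETE FROM HISTORY WHERE CHANNEL = 10 ',
                  'ORDER BY HISTORY_DATE ASC LIMIT ', GREATEST(@n - 50000, 0));
PREPARE trim_stmt FROM @sql;
EXECUTE trim_stmt;
DEALLOCATE PREPARE trim_stmt;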
>>>  Any other idea? Maybe a 'batch' operation?
>>>  I was thinking about something like this (doesn't work, just an idea):
>>>
>>>  DELETE FROM HISTORY WHERE CHANNEL IN (channel numbers)
>>>  AND xxxxx? LIMIT 50000 GROUP BY CHANNEL
>>>
>>>  I don't know any way to group by channel id and delete rows
>>>  using only one SQL command. Any idea?
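
Since DELETE has no GROUP BY and referencing the delete target in subqueries is restricted, a single grouped statement is awkward; a stored procedure that loops over the channels is a safe fallback. A sketch with a hypothetical name purge_all_channels, reusing the per-channel trim shown earlier in the thread:

DELIMITER //
CREATE PROCEDURE purge_all_channels()
BEGIN
  DECLARE done INT DEFAULT 0;
  DECLARE ch INT;
  DECLARE cur CURSOR FOR SELECT DISTINCT CHANNEL FROM HISTORY;
  DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

  OPEN cur;
  channel_loop: LOOP
    FETCH cur INTO ch;
    IF done THEN
      LEAVE channel_loop;
    END IF;
    -- trim this channel down to its newest 50000 rows
    DELETE FROM HISTORY
    WHERE CHANNEL = ch
      AND HISTORY_DATE < (
            SELECT cutoff FROM (
                SELECT HISTORY_DATE AS cutoff
                FROM HISTORY
                WHERE CHANNEL = ch
                ORDER BY HISTORY_DATE DESC
                LIMIT 1 OFFSET 49999
            ) AS t
          );
  END LOOP;
  CLOSE cur;
END //
DELIMITER ;

Run from a cron job or an event during a low-load window, each DELETE stays small and walks the (CHANNEL, HISTORY_DATE) primary key.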
>>>
>>>
>>>  thanks guys!
>>>  --
>>>  Roberto Spadim
>>>
>>>  SPAEmpresarial
>>>  Eng. Automação e Controle
>>>
>>>  -----Inline attachment follows-----
>>>
>>>  _______________________________________________
>>>  Mailing list: https://launchpad.net/~maria-discuss
>>>  Post to     : maria-discuss@xxxxxxxxxxxxxxxxxxx
>>>  Unsubscribe : https://launchpad.net/~maria-discuss
>>>  More help   : https://help.launchpad.net/ListHelp
>>>
>>>
>>
>>
>> --
>> Roberto Spadim
>> SPAEmpresarial
>> Eng. Automação e Controle
>>
>
>
>
> --
> Roberto Spadim
> SPAEmpresarial
> Eng. Automação e Controle
>



-- 
Roberto Spadim
SPAEmpresarial
Eng. Automação e Controle
