Thursday, November 27, 2008

Re: [fw-db] Parsing through large result sets

I did a bit of testing and can reproduce your error using the while
loop, though I traced it down to the underlying PDO MySQL driver, which
gives the same problem. I tried using PDO::MYSQL_ATTR_USE_BUFFERED_QUERY
but it seems to make no difference. I would suggest maybe using some
pagination, or not using PDO...
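
Something along these lines is what I mean by pagination (just a rough
sketch; the 'prices' table and 'security_id' column are made-up names,
so adapt to your schema):

    $pageSize = 1000;
    $offset   = 0;
    do {
        // pull one chunk at a time instead of the whole result set
        $select = $db->select()
                     ->from('prices')
                     ->order('security_id')
                     ->limit($pageSize, $offset);
        $rows = $db->fetchAll($select);
        foreach ($rows as $row) {
            // write $row out to the CSV / process it here
        }
        $offset += $pageSize;
    } while (count($rows) == $pageSize);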

I can't find many docs about large result sets in PDO; maybe it's a bug,
not sure :) If I get it working I will post the method...
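
In case anyone else wants to experiment, the attribute can be passed to
the adapter through 'driver_options', roughly like this (the connection
details below are placeholders):

    $db = Zend_Db::factory('Pdo_Mysql', array(
        'host'           => 'localhost',
        'username'       => 'user',
        'password'       => 'pass',
        'dbname'         => 'mydb',
        'driver_options' => array(
            // false = unbuffered query, rows are streamed from the server
            PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
        ),
    ));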

Thx

2008/11/27 janpolsen <janpolsen@gmail.com>:
>
> What monk.e.boy has written in the same thread is actually one of the things
> I need to do.
> I have a large table with securities and another even larger table with
> different types of prices related to the securities.
>
> Making a join on those two tables results in rows with 15-20 columns: the
> first column is the security id, the second column is a date, and the rest
> of the columns are the values for each of the different price types. At the
> moment this gives me about 20,000 rows.
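>
> To give an idea, the query is roughly of this shape (table and column names
> here are just illustrative, the real ones differ):
>
>     $select = $db->select()
>                  ->from(array('s' => 'securities'), array('id'))
>                  ->join(array('p' => 'prices'),
>                         'p.security_id = s.id',
>                         array('price_date', 'open_price', 'close_price'));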
>
> Trying to dump all that data to e.g. a CSV file triggers the
> out-of-memory error.
> I would've loved to be able to just do a:
> file_put_contents($filename, $db->fetchAll($sql));
> ... but fetchAll() understandably triggers the out-of-memory error.
>
> Hence the while ($row = $db->fetch()) loop I am used to, which sadly also
> triggers an out-of-memory error if I use ZF (bear in mind that I have
> memory_limit set to 512MB).
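>
> For the record, the loop is basically of this shape (simplified; $sql and
> $filename stand in for my real query and output path):
>
>     $stmt = $db->query($sql);
>     $fh = fopen($filename, 'w');
>     while ($row = $stmt->fetch()) {
>         fputcsv($fh, $row);   // one CSV line per result row
>     }
>     fclose($fh);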
>
> I hope you can follow me :)
>
>
> keith Pope-4 wrote:
>>
>> Could you describe what you are trying to do, how many rows etc.? I
>> wouldn't mind seeing if there is a workaround.
>
> --
> View this message in context: http://www.nabble.com/Parsing-through-large-result-sets-tp20264357p20718171.html
> Sent from the Zend DB mailing list archive at Nabble.com.
>
>

--
----------------------------------------------------------------------
[MuTe]
----------------------------------------------------------------------
