Monday, November 24, 2008

Re: [fw-db] Re: Res: Parsing through large result sets

You need to limit your query if you have a large dataset.
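
A minimal sketch of one way to do that with Zend_Db_Select, assuming a
hypothetical table "securities" and a 1000-row page size (both placeholders,
not tested against your schema): page through the table so that each query
only pulls a bounded number of rows.

$pageSize = 1000;                              // rows per round trip (placeholder)
$offset   = 0;

do {
    $select = $db->select()
                 ->from('securities')          // hypothetical table name
                 ->order('SecurityID')         // stable order so pages don't overlap
                 ->limit($pageSize, $offset);  // adapter translates this into its own SQL

    $rows = $db->fetchAll($select);

    foreach ($rows as $row) {
        echo $row['SecurityID'] . PHP_EOL;     // process one page worth of rows
    }

    $offset += $pageSize;
} while (count($rows) === $pageSize);          // last page comes back short -> stop

Peak memory then stays around one page of rows instead of the whole table.
With very large offsets the emulated LIMIT can get slow on SQL Server, so for
really big tables a keyset variant (WHERE SecurityID > last seen id) is worth
considering.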

2008/11/24 janpolsen <janpolsen@gmail.com>:
> Hi
>
> Now I have tested fetching large amounts of data in various scenarios, but I
> simply can't figure out how to get it to work. I understand what you write
> below, but it looks to me like PHP bails out with the "memory
> exhausted" error as soon as I do the ->query().
>
> I have tried three different methods of fetching data:
>
> $sql = "SELECT * FROM ";
>
> $stmt = new Zend_Db_Statement_Pdo($db, $sql);
> $stmt->execute();
> while ($row = $stmt->fetch()) {
>     echo $row['SecurityID'].PHP_EOL;
> }
>
> $res = $db->query($sql);
> while ($row = $res->fetch()) {
>     echo $row['SecurityID'].PHP_EOL;
> }
>
> $res = $db->fetchAll($sql);
> foreach ($res AS $row) {
>     echo $row['SecurityID'].PHP_EOL;
> }
>
> All three of them fail with PHP Fatal error: Allowed memory size of
> 536870912 bytes exhausted (tried to allocate 40 bytes) in
> L:\Internet\php\includes\ZendFramework-1.7.0\library\Zend\Db\Statement\Pdo.php
> on line 234
>
> Notice that I have my memory_limit in PHP set to 512MB!
>
> To me it seems like all three fetch methods try to fetch the whole
> dataset right away, instead of one row at a time.
>
> Am I using the methods wrong, or is something else going on?
>
> Best regards,
> Jan
>
> Luiz Fernando-4 wrote:
> You can do this:
>
> $sql = "SELECT something FROM random-table-with-an-obscene-large-amount-of-entries";
> $res = $db->query($sql);
> while ($row = $res->fetch(Zend_Db::FETCH_NUM)) {
>     // do something with the data returned in $row
> }
>
> ________________________________
> From: janpolsen
> To: fw-db@lists.zend.com
> Sent: Friday, October 31, 2008, 8:48:49
> Subject: [fw-db] Parsing through large result sets
>
> Hi
>
> Now I have googled through various search queries for the last two hours
> and I'm about to throw in the towel. I am used to doing something like this:
>
> $sql = "SELECT something FROM random-table-with-an-obscene-large-amount-of-entries";
> $res = mssql_query($sql);
> while ($row = mssql_fetch_array($res)) {
>     // do something with the data returned in $row
> }
>
> Now I have moved over to Zend_Db and want to do the very same thing, but how?
> In some places I use a simple straight-through approach like:
>
> $sql = "SELECT something FROM random-table-with-a-small-amount-of-data";
> $rows = $db->fetchAll($sql);
> foreach ($rows AS $row) {
>     // do something with the data returned in $row
> }
>
> However, I run into memory problems if I use that approach with my
> random-table-with-an-obscene-large-amount-of-entries. What am I missing
> here? Isn't it possible to use a while()-structure to loop through a
> Zend_Db result set? Thanks in advance...
>
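
As for why it dies inside ->query()/execute() itself: with a buffered PDO
driver the whole result set is copied into PHP memory before the first
fetch(), which matches the trace pointing into Zend_Db_Statement_Pdo. Purely
as an illustration of that buffering (an assumption about your setup: this
attribute only exists for pdo_mysql, not for the SQL Server drivers your
earlier mssql_* code suggests), turning it off makes rows stream from the
server as you fetch them:

// Hedged sketch, MySQL-only; connection details are placeholders.
$db = Zend_Db::factory('Pdo_Mysql', array(
    'host'           => 'localhost',
    'username'       => 'user',
    'password'       => 'secret',
    'dbname'         => 'mydb',
    'driver_options' => array(
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,   // stream rows instead of buffering them
    ),
));

$stmt = $db->query('SELECT SecurityID FROM securities');  // hypothetical table
while ($row = $stmt->fetch()) {
    echo $row['SecurityID'] . PHP_EOL;                     // rows arrive one at a time
}

On SQL Server there is no equivalent switch that I know of, so the paging
approach above is the safer bet.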

--
----------------------------------------------------------------------
[MuTe]
----------------------------------------------------------------------

No comments: