table. Even in MySQL this is a bad idea, and it is why MySQL has a safe
mode that stops this behaviour.
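
If you really do need to walk the whole table, the usual pattern is to
fetch one row at a time from the statement instead of materialising
everything with fetchAll(). Something roughly like this (just a sketch;
it assumes $db is an already-configured Zend_Db_Adapter instance):

$sql = "SELECT something FROM
random-table-with-an-obscene-large-amount-of-entries";

// query() returns a Zend_Db_Statement; fetch() pulls back one row per call
$stmt = $db->query($sql);
$stmt->setFetchMode(Zend_Db::FETCH_ASSOC);

while ($row = $stmt->fetch()) {
    // do something with the data returned in $row
}

Bear in mind that with the PDO MySQL adapter the driver still buffers the
whole result set on the client by default, so for a genuinely huge table
you may also want an unbuffered query or to page through it in chunks
with LIMIT.
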
2008/11/27 janpolsen <janpolsen@gmail.com>:
>
> Am I really the one and only person who has run into this (in my eyes) major
> problem with Zend_Db?
>
> I really can't believe it's intentional and hope that it's just me who
> doesn't know how to do it correctly :(.
>
>
> janpolsen wrote:
>>
>> Oi
>>
>> Now I have googled through various search queries for the last two hours
>> and I'm about to throw in the towel.
>>
>> I am used to doing something like this:
>>
>> $sql = "SELECT something FROM
>> random-table-with-an-obscene-large-amount-of-entries";
>> $res = mssql_query($sql);
>> while ($row = mssql_fetch_array($res)) {
>>     // do something with the data returned in $row
>> }
>>
>> Now I have moved over to Zend_Db and want to do the very same thing, but
>> how?
>> In some places I use a simple straight-through approach like:
>>
>> $sql = "SELECT something FROM
>> random-table-with-a-small-amount-of-data";
>> $rows = $db->fetchAll($sql);
>> foreach ($rows as $row) {
>>     // do something with the data returned in $row
>> }
>>
>> However, I run into memory problems if I use that approach with my
>> random-table-with-an-obscene-large-amount-of-entries.
>>
>> What am I missing here? Isn't it possible to use a while() structure
>> to loop through a Zend_Db result set?
>>
>> Thanks in advance...
>>
>
> --
> View this message in context: http://www.nabble.com/Parsing-through-large-result-sets-tp20264357p20716743.html
> Sent from the Zend DB mailing list archive at Nabble.com.
>
>
--
----------------------------------------------------------------------
[MuTe]
----------------------------------------------------------------------