>
> However that wouldn't make it work the way I want and if I want to iterate
> through a very large set of data, then that should be possible without
> using limits.
>
> It can be done without any problems whatsoever if I use the "builtin"
> mysql*-functions.
> It takes a second or so to execute the <code>$res =
> mysql_query($sql);</code> command, and then I can easily, and without any
> big memory consumption, do a <code>while ($row = mysql_fetch_assoc($res)) {
> ... }</code> on that result set.
>
> I am looking for exactly the same thing as that; however, I would very much
> like to keep using the Zend Framework.
>
> Best regards,
> Jan
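For reference, the unbuffered pattern Jan describes with the legacy mysql_* functions maps onto PDO roughly as follows. This is a sketch only: the DSN, credentials, and table name are placeholders, not values from the thread.

```php
<?php
// Sketch: connection parameters and table name are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');

// Switch the connection to unbuffered mode, so rows are streamed from
// the server on demand instead of being copied into PHP memory first.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT * FROM very_large_table');

// Rows arrive one at a time; memory use stays flat regardless of the
// size of the result set.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // ... process $row ...
}

// With unbuffered queries the cursor must be fully consumed or closed
// before another statement can run on the same connection.
$stmt->closeCursor();
```

The trade-off is the same as with mysql_unbuffered_query(): the connection is tied up until the result is fully read or the cursor is closed.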
Hello Users,
I have the same problem with Zend_Db statements. It looks like Zend_Db
uses buffered results by default and gives the user no way either to
deactivate them (as with, for example, PDO::MYSQL_ATTR_USE_BUFFERED_QUERY
= false) or to free a result without closing the database connection (I'm
currently checking this). I've spent several hours on it, and the best
approach I've found so far is to write my own driver (work in progress).
Help on using Zend_Db with unbuffered results is very welcome.
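One possible approach, assuming Zend Framework 1's Pdo_Mysql adapter (which forwards a 'driver_options' config key to the underlying PDO constructor), is to request unbuffered queries at connection time rather than writing a custom driver. A sketch, with placeholder connection parameters and table name:

```php
<?php
// Sketch, assuming ZF1's Zend_Db_Adapter_Pdo_Mysql passes
// 'driver_options' through to the PDO constructor.
// Connection parameters are placeholders.
$db = Zend_Db::factory('Pdo_Mysql', array(
    'host'           => 'localhost',
    'username'       => 'user',
    'password'       => 'password',
    'dbname'         => 'test',
    'driver_options' => array(
        // Ask PDO for unbuffered (streamed) result sets.
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
    ),
));

$stmt = $db->query('SELECT * FROM very_large_table');

// Zend_Db_Statement_Pdo::fetch() delegates to PDOStatement::fetch(),
// so this iterates row by row without buffering the whole result
// set in PHP memory.
while ($row = $stmt->fetch(Zend_Db::FETCH_ASSOC)) {
    // ... process $row ...
}

// Close the cursor before issuing further queries on this connection.
$stmt->closeCursor();
```

Note the usual unbuffered-query caveat: the connection cannot run another statement until this cursor is consumed or closed, so methods that issue queries mid-iteration will fail.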
With Best regards
Sven
--
View this message in context: http://n4.nabble.com/Res-Parsing-through-large-result-sets-tp674360p977782.html
Sent from the Zend DB mailing list archive at Nabble.com.