O.K. Thank you for your reply. I think there is nothing in the Apache logs,
but in any case these scripts will be run from the CLI via cron, so I guess
they will work out fine, thanks.
Regards,
Saša Stamenković
Bugzilla from dado@krizevci.info wrote:
>
> Hello to you and the list (my first mail here),
>
> On Tuesday 26 May 2009 11:24:00 umpirsky wrote:
>
>> I have a model which extends Zend_Db_Table_Abstract, and I'm inserting more
>> than 10000 records into a table using Zend_Db_Table_Abstract::insert() in
>> one foreach loop. I get a "Service Temporarily Unavailable" 503 error. I
>> profiled, and there is nothing wrong with the queries. For fewer than 7000
>> records it works perfectly.
>>
>> It's not a memory-limit problem, and it's not a max-execution-time problem.
>>
>> Any idea?
>
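For context, here is a minimal, self-contained reproduction of the row-by-row insert pattern described above. It uses PDO with an in-memory SQLite database standing in for the Zend_Db adapter, and a hypothetical `items` table; one common mitigation, batching the inserts inside transactions instead of auto-committing every row, is shown as well:

```php
<?php
// Sketch under assumptions: PDO + SQLite stands in for Zend_Db_Table_Abstract,
// and the "items" table and its columns are hypothetical, not the poster's schema.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)');

// Build 10000 rows, mirroring the size reported in the question.
$rows = array();
for ($i = 1; $i <= 10000; $i++) {
    $rows[] = array('name' => 'row' . $i);
}

// Instead of letting every insert() auto-commit, group rows into transactions.
$batchSize = 1000;
$stmt = $pdo->prepare('INSERT INTO items (name) VALUES (:name)');
$pdo->beginTransaction();
foreach ($rows as $i => $row) {
    $stmt->execute($row);                  // one INSERT per row, as in the loop above
    if (($i + 1) % $batchSize === 0) {
        $pdo->commit();                    // flush every $batchSize rows
        $pdo->beginTransaction();
    }
}
$pdo->commit();

$count = (int)$pdo->query('SELECT COUNT(*) FROM items')->fetchColumn();
echo $count, "\n"; // 10000
```

Batching does not fix a web-server-side limit by itself, but it sharply reduces per-row commit overhead, which is usually the first thing to try before abandoning the loop approach.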
> Judging by the error, you're running this code through a web server. If it's
> Apache, it could be that RLimitCPU is set (which is similar to PHP's max
> execution time); try checking the Apache error logs if you have access to
> them.
>
> Another thing: you shouldn't really push bulk data into a DB like this. The
> absolute fastest way I've found is to import the data from a dump file. If
> your user is uploading the file to import, just save it to a directory on
> the server and run your import script with cron, every minute or every five.
> You generate the dump file and then load it with the DBMS's bulk-loading
> tools. This approach can handle really large data sets.
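The cron-plus-dump approach above could look like this; all paths, the script name, and the database name are hypothetical placeholders, not anything from the original thread:

```
# Hypothetical crontab entry (edit with `crontab -e`): run the importer every
# 5 minutes and append its output to a log.
*/5 * * * * /usr/bin/php /var/www/app/scripts/import.php >> /var/log/import.log 2>&1

# Inside import.php, hand the uploaded dump to the DBMS's own bulk loader,
# e.g. for MySQL:
#   mysql appdb < /var/spool/app/incoming/dump.sql
# or use LOAD DATA INFILE, both of which are far faster than row-by-row INSERTs.
```

Because the import runs from the CLI, it is not subject to the web server's request limits, which fits the original poster's plan of running the scripts via cron.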
>
> HTH,
>
> --
> Dado
>
>
--
View this message in context: http://www.nabble.com/Error-503-while-inserting-many-records-tp23719594p23720584.html
Sent from the Zend DB mailing list archive at Nabble.com.