# cfml-general
d
Has cftransaction ever gotten in the way of your code? Code will be explained in thread...
<cftransaction action="begin">
	<cfloop query="getEmployees">
        <cfset employee.jobInfo = ultipro.GetJobByEmployeeIdentifier(getEmployees.cb_emp_n)> <!--- cfhttp() + xmlparse() --->
        <cfquery name="qry" datasource="mainds">
            insert into ....
        </cfquery>
    </cfloop>
    
    <cftransaction action="commit" />
</cftransaction>
d
You definitely want to avoid doing costly operations inside a transaction block, because it will lock your tables. Since GetJobByEmployeeIdentifier() seems to do some expensive operations (an HTTP call plus XML parsing), I'd recommend splitting the process: get all the data in one step, then insert it (preferably in bulk) in a second step.
d
Line 3: the GetJobBy... method makes an HTTP request to an external API URL and returns XML, then uses xmlParse() to get a struct of employee info. The error occurs in xmlParse() with "Content is not allowed in prolog." So I thought maybe it was a BOM issue or some other stray characters at the start of the response body, but all I was getting was "Connection Failure" when I looked at cfhttp.FileContent. Line 2: the loop processes about 600 employees, and I confirmed that xmlParse() does not fail for the first 20 to 30 records. Then I noticed the transaction here isn't needed, and removing the transaction code solved the issue: I no longer get "Connection Failure" on some of those HTTP requests. BUT WHY DOES CFTRANSACTION CAUSE CFHTTP TO THROW A CONNECTION FAILURE ERROR??
d
If the HTTP call is ending up querying the same database/table, then the transaction block can lead to the database locking the table, which can explain why some records work and then it breaks.
d
@dswitzer Splitting the retrieval of the data from the XML parse sounds questionable, but I'd do it if it makes sense. So do you recommend I persist 600 HTTP responses into memory, then xmlParse() the 600 responses into memory, and then finally cfquery all 600 records?
the HTTP call is doing an API service call to an external source and returning xml for each.
d
I would get the records, do the XML parse, and then extract the data you need for your insert into a data structure. Then loop through the data structure to insert it. Hopefully you can do that in a bulk operation.
d
@dswitzer I can't get the specific employee info I need in bulk; I have to make an HTTP request for each employee. So that means 600 HTTP requests, and I would have to persist 600 XML responses before doing any xmlParse()-ing.
d
Do the HTTP & XML parsing in one step, store the results of your XML parsing in a data structure.
<cfset employees = [] />
<!---// get the employee data //--->
<cfloop query="getEmployees">
  <cfset arrayAppend(employees, ultipro.GetJobByEmployeeIdentifier(getEmployees.cb_emp_n))> <!--- cfhttp() + xmlparse() --->
</cfloop>

<!---// now save the data //--->
<cfloop array="#employees#" index="employee">
   <!---// do insert //--->
</cfloop>
Preferably the save step would run in bulk (instead of 1 insert per employee, it would insert multiple records in one operation, but the syntax for that varies by database engine and version)
d
ah, well that's much simpler than what I thought you meant.
s
if your database supports prepared statements, you may also be able to improve performance by using those instead of 600 individual cfqueries
I've used prepared statements frequently in mass-import types of processes
d
MSSQL, and I've never heard of "prepared statements"
s
hmm, looks like things work a little differently in MS SQL than in Postgres and MySQL (which are what I usually use)
looks like you can do something similar with sp_prepare/sp_execute
d
ok I found the docs on sp_prepare and sp_execute. interesting approach to improve performance.
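s
For reference, here's a minimal sketch of the sp_prepare/sp_execute pattern run from CFML. The employee_jobs table, its columns, and the values are made-up names for illustration. Worth knowing too that <cfqueryparam> already gets you prepared-statement behavior from the JDBC driver in most setups, so this mainly matters if you're building the batch by hand:
```cfml
<!---// prepare once, execute many times, all in one cfquery batch //--->
<!---// table/column names (employee_jobs, emp_id, job_title) are illustrative //--->
<cfquery datasource="mainds">
    DECLARE @handle int;
    EXEC sp_prepare @handle OUTPUT,
         N'@emp_id varchar(20), @job_title varchar(100)',
         N'INSERT INTO employee_jobs (emp_id, job_title) VALUES (@emp_id, @job_title)';

    EXEC sp_execute @handle, 'E001', 'Analyst';
    EXEC sp_execute @handle, 'E002', 'Engineer';

    EXEC sp_unprepare @handle;
</cfquery>
```
The win is that SQL Server compiles the query plan once and reuses it for every execute.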
s
in essence, though, it just makes it so you can run a bunch of the same query without the db having to create the query plan and without having to re-connect and all that for every insert
d
@dswitzer @Scott Bennett any idea how cftransaction is causing http connection failures. this error is SOOOO misleading.
or is it just a problem of the language error handling?
d
I don't think it's causing your connection issues. I think it's causing the HTTP script you're calling to hang, which is leading to the connection timeout
s
yeah, probably the cftransaction is catching the http error and then rolling back the transaction would be my guess
d
Since you're using MSSQL, you can batch insert multiple rows using this syntax: https://www.sqlservertutorial.net/sql-server-basics/sql-server-insert-multiple-rows/
That should speed things up and reduce table locking
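A sketch of what that multi-row VALUES form could look like from CFML, assuming the parsed results are already in an employees array; employee_jobs, the columns, and the struct keys are assumptions, not your actual schema:
```cfml
<!---// one INSERT statement carrying many VALUES rows //--->
<!---// table/columns and struct keys (empId, jobTitle) are illustrative //--->
<cfquery datasource="mainds">
    INSERT INTO employee_jobs (emp_id, job_title)
    VALUES
    <cfloop from="1" to="#arrayLen(employees)#" index="i">
        <cfif i gt 1>,</cfif>
        (
            <cfqueryparam value="#employees[i].empId#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#employees[i].jobTitle#" cfsqltype="cf_sql_varchar">
        )
    </cfloop>
</cfquery>
```
One caveat: SQL Server caps a single statement at 2100 parameters, so with 600 rows and several columns you may need to split this into chunks anyway.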
d
ah yes, I've done this before. wow, it's been such a long time I forgot about this. Thanks @dswitzer
s
I sometimes will also batch those types of queries in groups of 50 or 100
(so as not to create a query string that is too long)
when dealing with hundreds or thousands of records, that can cause issues with the length of the SQL string that gets generated
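A rough sketch of that chunking from CFML (the employees array, employee_jobs table, columns, and struct keys are all made-up names for illustration):
```cfml
<!---// insert in batches of 100 rows per statement //--->
<!---// table/columns and struct keys are illustrative //--->
<cfset batchSize = 100>
<cfloop from="1" to="#arrayLen(employees)#" index="batchStart" step="#batchSize#">
    <cfset batchEnd = min(batchStart + batchSize - 1, arrayLen(employees))>
    <cfquery datasource="mainds">
        INSERT INTO employee_jobs (emp_id, job_title)
        VALUES
        <cfloop from="#batchStart#" to="#batchEnd#" index="i">
            <cfif i gt batchStart>,</cfif>
            (<cfqueryparam value="#employees[i].empId#" cfsqltype="cf_sql_varchar">,
             <cfqueryparam value="#employees[i].jobTitle#" cfsqltype="cf_sql_varchar">)
        </cfloop>
    </cfquery>
</cfloop>
```
Batching also keeps you under SQL Server's 2100-parameters-per-statement limit when every value goes through cfqueryparam.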
d
perfect. great tip. thanks.
s
you can also try OPENJSON with a CTE for a multi-row insert
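If you go that route, here's a sketch: serialize the whole array to JSON, send it as a single parameter, and let SQL Server shred it. Requires SQL Server 2016+ with compatibility level 130; the table, columns, and JSON paths are illustrative:
```cfml
<!---// ship the whole array as one JSON parameter, shred it server-side //--->
<!---// employee_jobs and the $.empId / $.jobTitle paths are illustrative //--->
<cfquery datasource="mainds">
    DECLARE @json nvarchar(max) =
        <cfqueryparam value="#serializeJSON(employees)#" cfsqltype="cf_sql_longvarchar">;

    WITH src AS (
        SELECT emp_id, job_title
        FROM OPENJSON(@json)
             WITH (
                 emp_id    varchar(20)  '$.empId',
                 job_title varchar(100) '$.jobTitle'
             )
    )
    INSERT INTO employee_jobs (emp_id, job_title)
    SELECT emp_id, job_title FROM src;
</cfquery>
```
This avoids both the long generated SQL string and the per-statement parameter limit, since the whole payload is one parameter.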
g
@sana I love CTEs!