
Posts

Showing posts with the label curl

Copy Image from Remote Server Over HTTP

I am looking for a simple way to import/copy images from a remote server to a local folder using PHP. I have no FTP access to the server, but all remote images can be accessed via HTTP (e.g. http://www.mydomain.com/myimage.jpg ). Example use: a user wishes to add an image to his profile. The image already exists on the web and the user provides a direct URL. I do not wish to hotlink the image but to import it and serve it from my own domain. Source: Tips4all
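A minimal sketch of one way to do this: fetch the image over HTTP with cURL and write it to a local file. The remote URL is the example from the question; the local path is a placeholder for illustration.

<?php
// Download a remote image over HTTP and save it locally.
$remoteUrl = 'http://www.mydomain.com/myimage.jpg';
$localPath = __DIR__ . '/uploads/myimage.jpg'; // assumed target folder

$ch = curl_init($remoteUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the real image
$imageData = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($imageData !== false && $status === 200) {
    file_put_contents($localPath, $imageData);
}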

POST a file string using cURL in PHP?

I was wondering if it is possible to post a file, along with other form data, when the file is just a string? I know you can post a file that already exists on the filesystem by prefixing the file path with "@". However, I'd like to bypass creating a temporary file and send the file content as a string, but I am unsure how to construct the request using cURL in PHP. Cheers.

$postFields = array(
    'otherFields' => 'Yes',
    'filename' => 'my_file.csv',
    'data' => 'comma separated content'
);
$options = array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_SSL_VERIFYPEER => false,
    CURLOPT_SSL_VERIFYHOST => 1,
    CURLOPT_POSTFIELDS => $postFields,
    CURLOPT_HTTPHEADER => array('Content-type: multipart/form-data')
);

Source: Tips4all
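A minimal sketch, assuming PHP 8.1 or later, where CURLStringFile lets you attach in-memory data as an upload without writing a temporary file. The endpoint URL is a placeholder, not from the original post.

<?php
$endpoint = 'https://example.com/upload'; // placeholder URL

$postFields = [
    'otherFields' => 'Yes',
    // data, posted filename, and MIME type
    'data' => new CURLStringFile('comma separated content', 'my_file.csv', 'text/csv'),
];

$ch = curl_init($endpoint);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
// Passing an array makes cURL build the multipart/form-data body itself,
// so the Content-Type header does not need to be set manually.
curl_setopt($ch, CURLOPT_POSTFIELDS, $postFields);
$response = curl_exec($ch);
curl_close($ch);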

Send SOAP XML via curl, PHP

This has been bugging me for days. I'm trying to send a SOAP POST via cURL, but I just keep getting a 'couldn't connect to host' error and I really can't see why. I have an ASP version which works fine with the same URL and data, so I think it's just a PHP/cURL thing...? I currently have the following code (the CURLOPT_POSTFIELDS data is a valid SOAP envelope string):

$soap_do = curl_init();
curl_setopt($soap_do, CURLOPT_URL, "https://xxx.yyy.com:517/zzz.asmx");
curl_setopt($soap_do, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($soap_do, CURLOPT_TIMEOUT, 10);
curl_setopt($soap_do, CURLOPT_RETURNTRANSFER, true);
curl_setopt($soap_do, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($soap_do, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($soap_do, CURLOPT_POST, true);
curl_setopt($soap_do, CURLOPT_POSTFIELDS, '<soap:Envelope>...</soap:Envelope>');
curl_setopt($soap_d
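A minimal sketch of a SOAP POST over cURL, reusing the URL from the question. The $soapEnvelope variable stands in for the valid envelope string, and the SOAPAction value "http://tempuri.org/zzz" is an assumption for illustration; the service's WSDL defines the real one.

<?php
$soapEnvelope = '<soap:Envelope>...</soap:Envelope>'; // the valid envelope string

$soap_do = curl_init();
curl_setopt($soap_do, CURLOPT_URL, 'https://xxx.yyy.com:517/zzz.asmx');
curl_setopt($soap_do, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($soap_do, CURLOPT_TIMEOUT, 10);
curl_setopt($soap_do, CURLOPT_RETURNTRANSFER, true);
curl_setopt($soap_do, CURLOPT_SSL_VERIFYPEER, false);
// CURLOPT_SSL_VERIFYHOST expects 0 or 2, not a boolean.
curl_setopt($soap_do, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($soap_do, CURLOPT_POST, true);
curl_setopt($soap_do, CURLOPT_POSTFIELDS, $soapEnvelope);
curl_setopt($soap_do, CURLOPT_HTTPHEADER, [
    'Content-Type: text/xml; charset=utf-8',
    'SOAPAction: "http://tempuri.org/zzz"', // assumed value
]);
$result = curl_exec($soap_do);
if ($result === false) {
    // "Couldn't connect to host" often points at the non-standard port 517
    // being blocked by a firewall rather than at the PHP code itself.
    echo curl_error($soap_do);
}
curl_close($soap_do);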

cURL equivalent in Java

I am tasked with writing an authentication component for an open source Java app. We have an in-house authentication widget that uses HTTPS. I have some example PHP code that accesses the widget and uses cURL to handle the transfer. My question is whether there is a port of cURL to Java, or better yet, what base package will get me close enough to handle the task? Update: this is, in a nutshell, the code I would like to replicate in Java:

$cp = curl_init();
$my_url = "https://" . AUTH_SERVER . "/auth/authenticate.asp?pt1=$uname&pt2=$pass&pt4=full";
curl_setopt($cp, CURLOPT_URL, $my_url);
curl_setopt($cp, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($cp);
curl_close($cp);

Heath, I think you're on the right track; I think I'm going to end up using HttpsURLConnection and then picking out what I need from the response.
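A minimal sketch of the same call in plain Java using HttpsURLConnection, the class mentioned in the update. The server name and credentials are placeholders standing in for AUTH_SERVER, $uname and $pass from the PHP snippet.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public class AuthClient {
    public static void main(String[] args) throws Exception {
        String authServer = "auth.example.com"; // placeholder for AUTH_SERVER
        String uname = "user";                  // pt1
        String pass = "secret";                 // pt2
        URL url = new URL("https://" + authServer
                + "/auth/authenticate.asp?pt1=" + uname
                + "&pt2=" + pass + "&pt4=full");

        HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // Equivalent of CURLOPT_RETURNTRANSFER: read the body into a string.
        StringBuilder result = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                result.append(line);
            }
        }
        System.out.println(result);
    }
}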

Disable form action redirect when using cURL to load a page

Could someone give me an idea of how to stop the page from redirecting when using cURL? I have this code:

$url = "http://somesite.com/submit.php?name=" . $name . "&email=" . $email;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
$curl_output = curl_exec($ch);
curl_close($ch);
echo "Hello!";

What happens is that inside submit.php there is a form that auto-submits using JavaScript. submit.php does not continue, since it redirects to the form's action page. What cURL options should I add so that submit.php continues, without the form action taking over the page, and runs the rest of its code?

submit.php:

<?php
$name = $_GET['name'];
$email = $_GET['email'];
?>
<form method="post" action="actionpage.php" name="myform">
....
</form>
<script type="text/javascript" lan
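A minimal sketch of one way around this. cURL never executes the JavaScript that auto-submits the form, so submit.php simply returns its HTML; no cURL option will make the browser-side redirect happen or not happen. If the goal is to reach the form's action page, one option is to POST the fields to actionpage.php directly. The URL and field names below are placeholders based on the question.

<?php
$name = 'Jane';               // placeholder values
$email = 'jane@example.com';
$fields = ['name' => $name, 'email' => $email];

$ch = curl_init('http://somesite.com/actionpage.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Stay on this request even if the action page issues a redirect.
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
$response = curl_exec($ch);
curl_close($ch);
echo "Hello!";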

PHP cURL timeout is not working

I'm having a server issue. I'm running a local server (for development) and I've switched from MAMP to XAMPP. However, on XAMPP, the cURL options CURLOPT_TIMEOUT_MS and CURLOPT_CONNECTTIMEOUT_MS give me the following error: Warning: curl_setopt() expects parameter 2 to be long, string given. Is this because of the PHP or cURL version? Maybe a configuration setting?

curl_setopt($this->ch, CURLOPT_CONNECTTIMEOUT_MS, 2500);

Additional information: OS X 10.6.8, PHP 5.3.1, cURL 7.19.7. Thanks in advance.

Edit: there seems to be some confusion about the error and the value being set. The error states that parameter 2 is invalid, not parameter 3, so CURLOPT_CONNECTTIMEOUT_MS itself seems to be the issue:

curl_setopt($this->ch, CURLOPT_CONNECTTIMEOUT_MS, 2500);
            ^^ #1      ^^ #2                      ^^ #3

Fun fact: var_dump(CURLOPT_CONNECTTIMEOUT_MS); displays string(25) "CURLOPT_CONNECTTIMEOUT_MS".
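A minimal sketch of a defensive fallback, assuming the constant is simply undefined on this build, which is what the var_dump suggests: PHP then treats the bare name as the string "CURLOPT_CONNECTTIMEOUT_MS", which is exactly the "parameter 2 to be long, string given" warning. The URL is a placeholder.

<?php
$ch = curl_init('https://example.com/'); // placeholder URL

if (defined('CURLOPT_CONNECTTIMEOUT_MS')) {
    // Millisecond timeouts need PHP 5.2.3+ built against libcurl 7.16.2+.
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, 2500);
} else {
    // Older builds only support second-level granularity.
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);
}
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);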

Posting Data to Amazon using curl

When we enter a keyword (search term) in the Amazon Kindle book store, the search brings back a list of books. I am trying to scrape this Amazon Kindle book search. I have some basic working knowledge of cURL in PHP, but I have never posted variables using cURL. I have tried at my level but haven't succeeded yet. The only thing I know is that I should send the "keyword" variable to Amazon and grab the result. The problem is that the keyword is submitted to the form and only part of the page is refreshed every time a keyword is entered. Can somebody walk me through the steps required? Which data will I need to post, including the keyword? How can I find out which headers/user agent are required? What information will be needed for this process? Which elements will be posted? I have tried using Fiddler, but as I am new to it, I am not getting the concepts. The link I want to parse is amazon.com. Guidance please, if I get the what need
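A minimal sketch, not a working Amazon scraper: it only illustrates sending a keyword as a GET query parameter with a browser-like User-Agent and reading the response. The URL and the "k"/"i" parameter names are assumptions for illustration; the real request captured in Fiddler may differ, and scraping may conflict with Amazon's terms of service.

<?php
$keyword = 'php'; // the search term to send
$url = 'https://www.amazon.com/s?' . http_build_query([
    'k' => $keyword,
    'i' => 'digital-text', // assumed Kindle-store section parameter
]);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; example-client)');
$html = curl_exec($ch);
curl_close($ch);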

Find a URL in a site's content, given the site's URL

I want to search the content of a site, given its URL, for my own URL (for example: http://www.mydomain.com/ ); if my URL exists there, return TRUE, otherwise return FALSE. If the URL appears only in forms like the following, it should return FALSE:
- http://www.mydomain.com/blog?12
- www.mydomain.com/news/maste.php
- http://www.mydomain.com/mkds/skas/aksa.html
- www.mydomain.com/
- www.mydomain.com
I want to accept (find) only: http://www.mydomain.com/ OR http://www.mydomain.com. I tried:

$url = 'http://www.usersite.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$contents = curl_exec($ch);
curl_close($ch);
$link = "/http:\/\/mydomain.com/";
if (preg_match("/" . preg_quote($link, "/") . "/m", $contents) && strstr($contents, "http://www.mydomain.com")) {
    echo 'TRUE';
} else {
    echo 'FALSE';
}

But it doesn't work, for it w
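A minimal sketch of one way to match only the exact homepage URL: fetch the page and use a pattern that rejects any character that would extend the URL into a longer path, query string, or different host. $url and the target domain are placeholders from the question.

<?php
$url = 'http://www.usersite.com'; // the site whose content is searched

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
curl_close($ch);

// Match http://www.mydomain.com or http://www.mydomain.com/ only when the
// next character is not part of a longer URL (path, query, fragment, or host).
$pattern = '~http://www\.mydomain\.com/?(?![\w/?#.-])~';

echo preg_match($pattern, $contents) ? 'TRUE' : 'FALSE';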

PHP-CURL curl_multi_exec is it really multithreaded internally?

My question is very simple, as stated in the title, but I'll rephrase it. I want to download multiple sites using PHP cURL, run from the console, and I am going to use curl_multi_exec to download all of them. Now the question: will cURL create a separate thread for each request? I know I can achieve concurrency by forking multiple processes, but that's not threading, and I don't want threading. I just want to know whether it is multi-threaded internally.
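A minimal sketch of curl_multi_exec for context. Note that this is not threading: libcurl drives all transfers concurrently inside the single PHP process using non-blocking sockets, so the downloads overlap without forking or creating threads. The URLs are placeholders.

<?php
$urls = ['https://example.com/', 'https://example.org/', 'https://example.net/'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers in one process until none are still running.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for socket activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $url => $ch) {
    $body = curl_multi_getcontent($ch);
    echo $url . ': ' . strlen($body) . " bytes\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);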

Getting the HTML source code of an .swf URI

I was wondering whether one can get the HTML source code from an .swf URI? For instance, a URL such as: http://media.flixfacts.com/360view/acer_uk/002/acer_uk-002-en.swf. When I use cURL to scrape this page it brings back the SWF source, not the HTML source. Any ideas?
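A minimal sketch that makes the behaviour visible: the URI points at the .swf binary itself, so cURL can only return Flash data; there is no HTML document at that address to fetch. A HEAD request shows this via the Content-Type header; the HTML would live on whatever page embeds the SWF.

<?php
$url = 'http://media.flixfacts.com/360view/acer_uk/002/acer_uk-002-en.swf';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$contentType = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

echo $contentType; // typically application/x-shockwave-flash, not text/html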