    Recent Activities
    • …brute force attacks using cURL, thus bypassing traditional lockouts on some of our CMS installations.

      I know we can disable cURL through php.ini or at compile time; however, I would like to know whether it is recommended to leave cURL active (as we would like to prevent any loss of service).

      Would disabling cURL be an unwise decision?
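      For reference, disabling PHP's cURL functions server-side (as opposed to an attacker's cURL client, which runs on their own machine) is typically done with the disable_functions directive rather than a recompile. A sketch:

```ini
; php.ini — sketch: blacklist the cURL functions so PHP scripts cannot call them.
; Note: this only affects PHP code running on this server; it does not stop
; remote clients from pointing the curl command-line tool at your site.
disable_functions = curl_init,curl_exec,curl_multi_exec
```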
    • CURL vs fopen vs file_get_contents? (self.PHPhelp)

      submitted 9 months ago by dbbeginner

      Is there a reason, performance wise, to choose one of the above methods over another (for retrieving responses from an API)?

      CURL seems better suited for scraping, since it gives more control over headers, etc.

      fopen seems like a lighter-weight version of CURL, and still good for opening large files since it can handle them line by line.

      file_get_contents seems better suited for retrieving these single-line JSON responses.

      But maybe that's just me counting the lines of code needed for each call, when they can all be wrapped in a function and then called with the same amount of effort later on.

      So I'm asking: is one better than the others? Is my interpretation of them correct? Or are there other factors I should consider?

      I don't want to bomb someone's API with 10,000 requests using each method (hahaha), but is there a noticeable difference between the methods?

      TL;DR: Community: what tool do you use to retrieve remote content, and why?

      all 5 comments


      [–][deleted] 2 points 9 months ago
      "performance wise"

      Sounds like premature optimization. As others have said, cURL is the right tool for this job. If you're hitting APIs you will likely need to set a header with OAuth tokens, set the user agent, or do any number of things that fopen and file_get_contents cannot easily do.
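      A sketch (not from the thread) of what this looks like: sending an OAuth bearer token and a user agent with cURL. The URL and token are placeholders.

```php
<?php
// Hit a (hypothetical) API endpoint with an Authorization header and a
// custom user agent — the kind of per-request control cURL makes easy.
$ch = curl_init('https://api.example.com/v1/me');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Authorization: Bearer YOUR_TOKEN']);
curl_setopt($ch, CURLOPT_USERAGENT, 'my-client/1.0');
$body = curl_exec($ch);
curl_close($ch);
```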




      [–]Innominate8 3 points 9 months ago
      The answer is curl because it gives you better control over timeouts.

      One day the server you're trying to contact will be unresponsive and your code will sit there and wait until it hits the configured timeout. If your code is not timing out and handling this error case, it can bring down your site as well.
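      A sketch of the timeout handling this comment describes; the URL is a placeholder.

```php
<?php
// Bound both the connection attempt and the total transfer, and handle
// failure explicitly instead of letting stuck requests pile up and
// exhaust the web server's worker processes.
$ch = curl_init('https://api.example.com/data');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // give up connecting after 5 s
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // abort the whole transfer after 10 s
$body = curl_exec($ch);
if ($body === false) {
    error_log('API request failed: ' . curl_error($ch));
}
curl_close($ch);
```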


      [–]ud_patter 1 point 9 months ago
      cURL is more flexible and allows additional HTTP methods, so fopen may be fine for GETting data but cannot POST information to forms (or use PATCH/PUT/DELETE), as is commonly needed for REST interfaces.

      So it's really a question of what your code needs to do.

      Bear in mind that fopen on URLs may be disabled by your server administrator (the allow_url_fopen setting), and that cURL may also not be present, depending on the server's configuration.
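      A sketch of the POST (and other REST verbs) this comment refers to; the endpoint and fields are placeholders.

```php
<?php
// POST a form-encoded body with cURL — something fopen()'s default
// http wrapper won't do for you without extra stream-context setup.
$ch = curl_init('https://example.com/api/items');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(['name' => 'widget']));
// For REST verbs beyond GET/POST:
// curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'DELETE');
$response = curl_exec($ch);
curl_close($ch);
```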


      [–]andersevenrud 1 point 9 months ago
      With fopen you create a stream; file_get_contents will read the entire resource into memory.

      If you're handling large files/resources you should go for streams, because then you won't run out of memory (well... I guess it depends on exactly what you're doing).

      cURL is an HTTP client library that allows for more than just reading resources (fopen/file_get_contents only read/fetch remote resources).
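      A sketch of the difference: file_get_contents() loads the whole response at once, while an fopen() stream lets you process a large resource in chunks. The URL is a placeholder.

```php
<?php
// Read a large remote resource through a stream, one buffered line at a
// time, so peak memory use stays small regardless of the resource's size.
$fh = fopen('https://example.com/large-export.csv', 'r');
if ($fh !== false) {
    while (($line = fgets($fh)) !== false) {
        // process $line here
    }
    fclose($fh);
}
```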


      [–]colshrapnel -3 points 9 months ago
      "fopen, still good for opening large files since it handles them on a line by line basis."

      There are no files in HTTP. Do you really think that calling fgets() will make PHP retrieve just a single "line" from the remote server? And that the next fgets() call will trigger another round trip, and so on? In reality PHP buffers the response stream, and fgets() reads lines out of that buffer, not off the network one line at a time.

      "is there a noticeable difference between the different methods?"

      Performance-wise, no. Ever heard of network latency? Do you think any of these methods will magically move your code onto the remote server? If not, why care which one you use, when each of them has to download the same data over the same network?

      "what tool do you use to retrieve remote content and why?"

      A wrapper around cURL, such as Guzzle.

      That said, as often happens, this question comes out of nowhere and overthinks the problem. If your API returns 3 bytes of data every day, don't waste everyone's time and just use file_get_contents.
    • Design patterns
      Active Record
      Model-View-Controller
      Dependency injection
      Observer
      Singleton
      Event-driven
      MTV
      Factory
      RESTful
      Facade

    • Object-relational mapping (ORM, O/RM, and O/R mapping tool) in computer science is a programming technique for converting data between incompatible type systems using object-oriented programming languages. This creates, in effect, a "virtual object database" that can be used from within the programming language.
    • The phar extension provides a way to put entire PHP applications into a single file called a "phar" (PHP Archive) for easy distribution and installation. In addition to providing this service, the phar extension also provides a file-format abstraction method for creating and manipulating tar and zip files through the PharData class, much as PDO provides a unified interface for accessing different databases. Unlike PDO, which cannot convert between different databases, Phar can also convert between the tar, zip, and phar file formats with a single line of code. See Phar::convertToExecutable() for one example.

      What is phar? Phar archives are best characterized as a convenient way to group several files into a single file. As such, a phar archive provides a way to distribute a complete PHP application in a single file and run it from that file without the need to extract it to disk. Additionally, phar archives can be executed by PHP as easily as any other file, both on the commandline and from a web server. Phar is kind of like a thumb drive for PHP applications.

      Phar implements this functionality through a Stream Wrapper. Normally, to use an external file within a PHP script, you would use include

      Example #1 Using an external file

      <?php
      include '/path/to/external/file.php';
      ?>
      PHP can be thought of as actually translating /path/to/external/file.php into a stream wrapper as file:///path/to/external/file.php, and under the hood it does in fact use the plain file stream wrapper stream functions to access all local files.

      To use a file named file.php contained with a phar archive /path/to/myphar.phar, the syntax is very similar to the file:// syntax above.

      Example #2 Using a file within a phar archive

      <?php
      include 'phar:///path/to/myphar.phar/file.php';
      ?>
      In fact, one can treat a phar archive exactly as if it were an external disk, using any of the fopen()-related, opendir(), and mkdir()-related functions to read, change, or create files and directories within the phar archive. This allows complete PHP applications to be distributed in a single file and run directly from that file.

      The most common usage for a phar archive is to distribute a complete application in a single file. For instance, the PEAR Installer bundled with releases of PHP is distributed as a phar archive. To use a phar archive distributed in this way, the archive can be executed on the command line or via a web server.

      Phar archives can be distributed as tar archives, zip archives, or as the custom phar file format designed specifically for the phar extension. Each file format has advantages and disadvantages. The tar and zip file formats can be read or extracted by any third-party tool that can read the format, but require the phar extension in order to run with PHP. The phar file format is customized and unique to the phar extension, and can only be created by the phar extension or the PEAR package » PHP_Archive, but has the advantage that applications created in this format will run even if the phar extension is not enabled.
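      A minimal sketch of building a phar archive in the phar format; the paths and filenames are placeholders.

```php
<?php
// Build a phar from a project directory. Note: phar.readonly must be 0
// (e.g. run with `php -d phar.readonly=0 build.php`) before Phar objects
// can be written.
$phar = new Phar('myapp.phar');
$phar->buildFromDirectory('/path/to/myapp');             // add every file under the directory
$phar->setStub(Phar::createDefaultStub('index.php'));    // run index.php when the archive is executed
```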