Problem

I have two servers for a website. One server will host the PHP code and the database, and another, slower server will store files only. I need to implement this so that files uploaded through the website are stored on the second server and can also be downloaded from there.

Can anyone suggest the best way to achieve this? I know the files can be transferred to the other server with PHP's FTP functions right after they are uploaded through the website, but that doesn't seem like the correct approach.

Alternatively, the second server could be used only for static media content such as images.

Thanks


Solution

The best idea is to keep ALL the files, including the website's files, on the "storage server". Basically, you mount the "shared folder", that is, the website's files and whatever other files you will need. (Most of the time you have a /var/www-local/ folder on the storage server, which you mount at /var/www/ on the web server.)

Make sure you mount it using NFS by adding an entry to the /etc/fstab file on the web server. (More info on NFS)
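As a minimal sketch of that setup, assuming the storage server is reachable as `storage` and exports `/var/www-local` (both names are placeholders):

```shell
# On the storage server: export the folder via NFS.
# Add to /etc/exports (web server IP is a placeholder):
#   /var/www-local  192.168.1.10(rw,sync,no_subtree_check)
sudo exportfs -ra

# On the web server: mount it once to test.
sudo mount -t nfs storage:/var/www-local /var/www

# Make it permanent by adding this line to /etc/fstab:
#   storage:/var/www-local  /var/www  nfs  defaults,_netdev  0  0
```

The `_netdev` option tells the system to wait for the network before attempting the mount at boot.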

The advantage of this approach is that it makes expansion easy: put a software load balancer (such as HAProxy) in front, add as many web servers as you like, and your data stays in sync.
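To illustrate, a minimal HAProxy configuration fragment for balancing two such web servers might look like this (server names, IPs, and ports are placeholders):

```
frontend http_in
    bind *:80
    default_backend webservers

backend webservers
    balance roundrobin
    server web1 192.168.1.11:80 check
    server web2 192.168.1.12:80 check
```

Because both web servers mount the same NFS share, either one can serve any request.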

Other tips

I've used something called Gluster, which allows for setups like this. The situation I use it for is more load balancing than alternate content distribution.

Also, sites like Stack Overflow use Content Delivery Network (CDN) services for certain pieces of information on their site. A solution like this might actually be more cost-effective than buying, setting up, and maintaining a whole new server.

Maybe you can mount the upload directory on your web server. Check Linux NFS.

Don't know if this is the best way, but why not simply put the file server on a separate subdomain? That way it can handle all of the file-download minutiae, and you can connect to it via FTP or SFTP from the main server.

Basically, here is a process you could use:

  1. Point your subdomain at the secondary server. I found some information on that here.
  2. Have an upload form on your main server which processes and validates the file.
  3. When the file is deemed acceptable, send it to the other server via FTP or SFTP. If you can't get the PHP tools working for this, phpseclib might help. You may want to make this step multi-threaded.
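A hedged sketch of step 3 using phpseclib 3 (the hostname, credentials, form field name, and remote path are all placeholders, and this assumes `composer require phpseclib/phpseclib:~3.0`):

```php
<?php
require 'vendor/autoload.php';

use phpseclib3\Net\SFTP;

// Connect to the file server; host and credentials are placeholders.
$sftp = new SFTP('files.example.com');
if (!$sftp->login('uploaduser', 'secret')) {
    exit('SFTP login failed');
}

// After your upload form has validated the file, push it across.
$tmpName = $_FILES['upload']['tmp_name'];
$remote  = 'uploads/' . basename($_FILES['upload']['name']);

// SOURCE_LOCAL_FILE tells put() to read from a local path
// rather than treating the second argument as raw data.
if (!$sftp->put($remote, $tmpName, SFTP::SOURCE_LOCAL_FILE)) {
    exit("Could not upload $remote");
}
```

SFTP avoids sending credentials in plain text, which is one reason to prefer it over plain FTP here.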

By using a URL wrapper, you can use the default move_uploaded_file(), provided your FTP server accepts this connection type. Alternatively, you can use PHP's FTP functions, especially ftp_put(), to upload the file to the server.

For content delivery, you need a database or some other means of looking up each item's URL on the content distribution server, and then put that URL into the HTML attributes:

<img src="http://cdn1.example.com/images/68263483.png" />
<a href="http://cdn2.example.com/files/9872345.pdf">Download PDF</a>

Example code to handle uploaded files would be:

<?php
// ...
$uploads_dir = 'images';
foreach ($_FILES["pictures"]["error"] as $key => $error) {
    if ($error === UPLOAD_ERR_OK) {
        $tmp_name = $_FILES["pictures"]["tmp_name"][$key];
        // basename() strips any path components a client might inject.
        $name = basename($_FILES["pictures"]["name"][$key]);
        if (!move_uploaded_file($tmp_name,
                "ftp://user:pass@cdn1.example.com/$uploads_dir/$name")) {
            die("could not upload $name to ftp server");
        }
        // save the URL in your database for later retrieval ...
    }
}

or with ftp_put():

<?php
// ...
$ftpCon = ftp_connect('cdn1.example.com')
    or die('Could not connect to ftp server');
// ftp_put() will fail without an authenticated session.
ftp_login($ftpCon, 'user', 'pass')
    or die('Could not log in to ftp server');
$uploads_dir = 'images';
foreach ($_FILES["pictures"]["error"] as $key => $error) {
    if ($error === UPLOAD_ERR_OK) {
        $tmp_name = $_FILES["pictures"]["tmp_name"][$key];
        $name = basename($_FILES["pictures"]["name"][$key]);
        ftp_put($ftpCon, "$uploads_dir/$name", $tmp_name, FTP_BINARY)
            or die("could not upload $name to ftp server");
        // save the URL in your database for later retrieval ...
    }
}
ftp_close($ftpCon);

If your application requires intensive reads/writes to the file server, then I would think it's a bad idea to separate it. Maybe you can mirror the NFS share onto your main web server to reduce latency.

Use nginx to handle all requests for static files (e.g. CSS, images, JavaScript).
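For example, a minimal nginx server block on the file server that serves static assets straight from disk might look like this (the server name and root path are placeholders):

```
server {
    listen 80;
    server_name cdn1.example.com;
    root /var/www/static;

    # Serve common static types with long-lived cache headers.
    location ~* \.(css|js|png|jpe?g|gif|pdf)$ {
        expires 30d;
        add_header Cache-Control "public";
    }
}
```

Offloading static files this way keeps the PHP server free to handle dynamic requests only.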

  1. Create a share on the file server for the directory where you want to download and upload files. Use two shares if you plan to keep them in separate folders.

  2. Map the folders on the web server that holds the PHP codebase. Make sure you connect to the share(s) with a login that has read and write access.

  3. Once you have mapped the share(s), create a virtual directory on the website to the shared folder (this all depends on your web server).

  4. Check how PHP handles accessing files on a shared drive in a website; depending on your web server, it may differ from how you normally handle files.

I like this approach because the server OS handles transferring the files to and from the file server. PHP just tells it which files it needs.

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow