
Server died because of no more disk space (temp session files in /tmp do not get removed) #158

Open

wants to merge 9 commits into master

Conversation

droletmarc

  • I've added code to remove the temporary session file that gets created
    on each RETS request. The file is no longer needed once we receive the
    server's response, but it was never removed.

  • The /tmp folder was hardcoded; I've added a way to configure it in the
    PHRets configuration. By default it will be null.

droletmarc added 4 commits July 14, 2017 08:24

- Remove the temporary session file that gets created on each RETS request.
  The file is no longer needed once we receive the server's response, but it
  was never removed.

- The /tmp folder was hardcoded; add a way to configure it in the PHRets
  configuration. By default it will be null.

- Authentication did not work if you passed a value (basic or digest) into
  http_authentication. Modify the code so that http_authentication = null
  can be set in the config.
@steveheinsch

I think by default, it should be /tmp like it was, with the ability to change it if you wish. Otherwise you could break a lot of existing code running on different server setups. I do like the idea of being able to specify the directory IF you don't want to use the /tmp default dir.

@fedeisas
Contributor

I had a server die because of this. I'm willing to pick up this PR and make the changes needed to get it into master. @troydavisson

@troydavisson
Owner

I'd happily accept some feedback and other ideas for how to address this.

A couple of issues that come to mind (some related to the code changes mentioned here, or just concerns generally):

  • Many users (myself included) have multiple, separate processes that require RETS connections (and, by extension, cookie files for cURL), and those are often running at the same time. An approach that clears/cleans an entire directory at a certain point has the potential to break other processes that are actively running
  • At a smaller level, the solution would also need to be aware of why the cookie file is used, even for the single process that uses it. The code change proposed here looks to delete the cookie file after every request, which negates the whole reason for it (to persist session details between requests, since these are stateful connections)
  • CURL is fairly limited in what it supports for cookie jars, but maybe some changes have been made recently that I'm unaware of
  • I've explored not using CURL cookie handling at all and instead handling things with Guzzle middleware, but CURL does some automatic handling of certain responses that doesn't work correctly with Guzzle and so I had to back down from this approach

I'm happy to discuss these on Slack with anyone interested in brainstorming a solution.

@fedeisas
Contributor

@troydavisson I think the current code is using a different cookie file for each request.

@jeb218

jeb218 commented Apr 16, 2018

My server also crashed yesterday and today because of this. My /tmp directory had 2.3 million of these phrets files built up over the last 6 months or so; these files are never removed. I deleted all of them this morning, checked just now, and already have 42,000 of them back in the directory.

Some kind of solution is needed here, and I greatly appreciate your time on this project. It has been an excellent resource for my company.

Edit: I also thought it might be worth noting that all of the files (at least on my server) are empty. No session data is stored in them. They are all 0 byte files. The reason space runs out isn't actually due to storage capacity, but rather due to inode capacity. Each file has an inode entry that contains metadata about the file, so with these files never being removed, inode space is being maxed out.
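The inode-exhaustion symptom described above can be confirmed from a shell; a quick sketch (the /tmp mount point here is illustrative):

```shell
# Disk bytes can look fine while inodes are exhausted; compare both views.
df -h /tmp    # byte usage ("no space left on device" can appear with plenty free here)
df -i /tmp    # inode usage; IUse% at 100% is the real culprit

# Count directory entries without stat()-ing millions of files:
ls -1f /tmp | wc -l
```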

@DeskAgent

@fedeisas You are correct. Every request creates a new cookie file with a unique name via tempnam. I had to get this fixed quickly, so I added an unlink right before the request returns:

Session.php:428

unlink($options['curl'][CURLOPT_COOKIEFILE]);

@fedeisas
Contributor

fedeisas commented May 25, 2018

@DeskAgent I extended the Session class to override the getDefaultOptions method and change the options to:

$options['curl'][CURLOPT_COOKIEFILE] = '';

I think this keeps cookies in memory.

@maietta

maietta commented Jul 17, 2018

This was a tough one to diagnose. I awoke to a frantic cPanel VPS balking about no disk space. Finally, I traced it down to /tmp being filled to the gills with phrets* files. Millions of them.

The ls command took a very long time and displayed nothing. I was also unable to use find:
$> find /tmp -name 'phrets*' -exec rm {} \;

No matter what I tried, I kept getting errors or issues. So here's my final solution, which should help others who need a fix NOW, without modifying the PHRets library or waiting for fixes that may never come.

To mitigate this problem you will need a shell with wheel privileges:

$> cd /tmp && ls > list.txt

Create a PHP file; call it "fix.php". It doesn't have to be PHP, but PHP is convenient and easier than a shell script for most people.

<?php
$file = fopen("/tmp/list.txt", "r");
while (!feof($file)) {
    $name = trim(fgets($file));
    if (substr($name, 0, 6) === "phrets") {
        file_put_contents("/tmp/delete.txt", "/tmp/" . $name . "\n", FILE_APPEND);
    }
}
fclose($file);

$> php fix.php

After a potentially very long wait, you will have a delete.txt file.

Now run:
$> xargs rm < delete.txt

NOTE: If you want to keep the stock library unpatched, consider using the Unix/Linux scheduler for cleanup.

$> crontab -e

0 */6 * * * find /tmp -name 'phrets*' -type f -exec rm {} \; >/dev/null 2>&1

droletmarc and others added 5 commits August 15, 2018 15:46
- SimpleXMLElement returns an error when the string content is not valid
  UTF-8. Make sure the content is UTF-8; other characters get discarded.

- Bring master project code changes to my fork
@maietta

maietta commented Mar 20, 2019

@DeskAgent I extended the Session class to override the getDefaultOptions method and change the options to:

$options['curl'][CURLOPT_COOKIEFILE] = '';

I think this keeps cookies in memory.

Could you provide a sample of how you "extended" the Session class and overrode the getDefaultOptions method?

Something like:

class MySession extends \PHRETS\Session
{
    // Signature and visibility may need to match the parent's; check the class.
    protected function getDefaultOptions()
    {
        $options = parent::getDefaultOptions();
        // Keep cookies in memory instead of a tempnam() file in /tmp
        $options['curl'][CURLOPT_COOKIEFILE] = '';
        return $options;
    }
}

What else would be needed?

This session problem still persists, so I'm looking at fixing it with the options provided by others. My method is not a great option, but it solves the problem temporarily. Thanks.

@techstroke

techstroke commented Apr 10, 2019

I had a similar case, but with around 20.5 million of these files in the /tmp/ directory; even ls in /tmp wouldn't work. Eventually I found a way to list all the files and remove them.

ls -1f worked, since it doesn't check file attributes/metadata, and that's how I saw they were phrets* files.

After that, I removed the files with ls -1f | xargs -Ifile rm -v file while inside /tmp.

Then I added a cron job for my root user:

crontab -e
0 2 * * * find /tmp/ -name "phrets*" -mtime +2 -type f -delete >> /dev/null 2>&1

This deletes phrets* files older than 2 days every day around 2 AM. Hope it helps others dealing with the same issue!
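The find-based cleanup above can be exercised safely against a scratch directory before touching the real /tmp; a minimal sketch (paths and filenames are illustrative):

```shell
# Dry run of the cron job's find expression in a throwaway directory.
scratch=$(mktemp -d)
touch "$scratch/phrets12345" "$scratch/phrets67890" "$scratch/unrelated.txt"

# Same pattern as the cron entry, minus -mtime so the fresh files match:
find "$scratch" -name "phrets*" -type f -delete

ls "$scratch"        # only unrelated.txt should remain
rm -rf "$scratch"
```

Using -delete instead of -exec rm {} \; avoids spawning one rm process per file, which matters when millions of files have accumulated.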

@troydavisson
Owner

For those interested and able to test, I've pushed a branch which includes the proposed fix.

https://packagist.org/packages/troydavisson/phrets#dev-cookie-files

211172a

Please test that within your systems and see if that resolves your issue.

@moazam1

moazam1 commented Feb 22, 2020

@troydavisson thank you. Can you merge with main repo now?

@SteveMullen

SteveMullen commented Apr 11, 2020

@troydavisson - the "All checks have passed" and "This branch has no conflicts..." messages lead me to believe that this fix has been merged. I've done a Composer update, and am at 2.6.2 (user agent string). I'm still seeing massive numbers of tmp files. Can you point me in the right direction?

Thank you

@maietta

maietta commented Apr 11, 2020

Yum update is for CentOS or Red Hat based Linux distributions. You thought updating the operating system would update the software? That's not how this works. Please consider how the software was actually installed and look at what it would take to get it updated. What I would do is locate where this library is installed, take a snapshot of everything, then perhaps run a composer update if you have a composer.json or .lock file in place. But double-check the contents of that composer.json file to make sure it reflects that you've got PHRets installed. If so, you can run composer update. Then verify that everything is working.

@SteveMullen

SteveMullen commented Apr 11, 2020

Yum update is for CentOS or Red Hat based Linux distributions. You thought updating the operating system would update the software? That's not how this works. Please consider how the software was actually installed and look at what it would take to get it updated. What I would do is locate where this library is installed, take a snapshot of everything, then perhaps run a composer update if you have a composer.json or .lock file in place. But double-check the contents of that composer.json file to make sure it reflects that you've got PHRets installed. If so, you can run composer update. Then verify that everything is working.

I am running CentOS. PHRets was installed via Composer. I did a 'composer update'.

@maietta

maietta commented Apr 26, 2020

Yum update is for CentOS or Red Hat based Linux distributions. You thought updating the operating system would update the software? That's not how this works. Please consider how the software was actually installed and look at what it would take to get it updated. What I would do is locate where this library is installed, take a snapshot of everything, then perhaps run a composer update if you have a composer.json or .lock file in place. But double-check the contents of that composer.json file to make sure it reflects that you've got PHRets installed. If so, you can run composer update. Then verify that everything is working.

I am running CentOS. PHRets was installed via Composer. I did a 'composer update'.

I don't know why I didn't get a notification for your response. If you installed via Composer, then you need to inspect the composer file and make sure it's not locked to a specific version of PHRets you no longer want to use. Point Composer at the version you're after, then run composer update (after backing up your files, of course).

@moazam1

moazam1 commented Jul 31, 2020

@troydavisson really appreciate if you can merge this now.
