
Avoid setting base url in config, instead automatically construct base url from request #668

Open
wants to merge 1 commit into master

Conversation

@Haocen (Contributor) commented Jul 6, 2020

This PR fixes
Closes #651

Changes

  • Avoid setting the base URL in the config; instead, automatically construct the base URL from the request


Why do it manually when we can detect it automatically from the code?
A small change to ease the burden on server admins a little bit.
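
For illustration only, a minimal sketch of the idea, assuming the webserver populates the usual HTTPS, HTTP_HOST and REQUEST_URI entries in $_SERVER (the function name and fallbacks here are hypothetical, not the actual patch):

/**
 * Hypothetical sketch: derive the base URL from the current request.
 * Assumes the webserver sets HTTPS, HTTP_HOST and REQUEST_URI.
 */
function guessBaseUrl()
{
    $scheme = (array_key_exists('HTTPS', $_SERVER) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
    $host   = array_key_exists('HTTP_HOST', $_SERVER) ? $_SERVER['HTTP_HOST'] : 'localhost';
    $path   = array_key_exists('REQUEST_URI', $_SERVER) ?
        parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH) : '/';
    return $scheme . '://' . htmlspecialchars($host) . htmlspecialchars($path);
}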

@rugk (Member) commented Jul 6, 2020

@ZerooCool If you want, feel free to review this using GitHub's review tool (see "files changed" tab).

@elrido (Contributor) commented Jul 6, 2020

The unit tests fail because they don't set up the $_SERVER superglobals with all of those keys - these depend on the webserver setting them for PHP. Apache and nginx behave very differently here; some keys are set by one but not the other, etc. I would suggest replacing the isset($_SERVER['XYZ']) checks with the array_key_exists('XYZ', $_SERVER) construct, as that won't access non-existing keys and the tests will pass.
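
For reference, the difference between the two constructs in a tiny sketch ('XYZ' is just a placeholder key, not from the patch):

// isset() is false when the key is missing or its value is null
$value = isset($_SERVER['XYZ']) ? $_SERVER['XYZ'] : '/';

// array_key_exists() only checks that the key is present
$value = array_key_exists('XYZ', $_SERVER) ? $_SERVER['XYZ'] : '/';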

Maybe we can extend and re-use the logic that is already partially there in the Request class:

PrivateBin/lib/Request.php

Lines 196 to 221 in 33bcce5

/**
 * Get host as requested by the client
 *
 * @access public
 * @return string
 */
public function getHost()
{
    return array_key_exists('HTTP_HOST', $_SERVER) ?
        htmlspecialchars($_SERVER['HTTP_HOST']) :
        'localhost';
}

/**
 * Get request URI
 *
 * @access public
 * @return string
 */
public function getRequestUri()
{
    return array_key_exists('REQUEST_URI', $_SERVER) ?
        htmlspecialchars(
            parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH)
        ) : '/';
}

@ZerooCool had also mentioned the use case where the instance isn't located directly under the webroot; in that case you might want to add the REQUEST_URI to the path.

BTW, this is what we are currently using these methods for:

$page->assign('HTTPSLINK', 'https://' . $this->_request->getHost() . $this->_request->getRequestUri());

We could streamline that latter method call by using the already stored $this->_urlBase:

$this->_urlBase = $this->_request->getRequestUri();
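
The HTTPSLINK assignment above could then become something like this (a sketch, assuming $this->_urlBase has been set that way before the template is rendered):

$page->assign('HTTPSLINK', 'https://' . $this->_request->getHost() . $this->_urlBase);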

@Haocen (Contributor, Author) commented Jul 6, 2020

Sounds great. Please bear with me, as I'm a PHP novice; I'll take a closer look at this.

@rugk mentioned this pull request on Aug 11, 2020
@ZerooCool (Contributor) commented

1-
The <html> tag needs to declare the Open Graph prefix, i.e. replace
<html>
with
<html prefix="og: https://ogp.me/ns#" >

2-
I noticed that if robots.txt denies all robots, Twitter's card validator (https://cards-dev.twitter.com/validator) cannot fetch the page, and then the Open Graph preview doesn't work either. That is really a problem on Twitter's side!

In robots.txt, we could include an authorization for the Twitter crawler, commented out by default, and suggest uncommenting it to allow Twitter's Open Graph preview to work.
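
A sketch of what such a commented-out entry in robots.txt could look like (assuming Twitter's crawler identifies itself as Twitterbot):

# Uncomment the next two lines to let Twitter's crawler read the Open Graph tags:
# User-agent: Twitterbot
# Disallow:

User-agent: *
Disallow: /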

3-
For Facebook, it is better to use an image of at least 200×200 pixels:

<meta property="og:image" content="REAL PATH FROM IMAGE" />
<meta property="og:image:type" content="image/png" />
<meta property="og:image:width" content="200" />
<meta property="og:image:height" content="200" />

Development

Successfully merging this pull request may close these issues.

It would be better to place all the entries coming from the META tags, with an absolute URL.