
Feature request: .tldrrc - support multiple repositories and local files/directories #299

Open · timonf opened this issue May 20, 2020 · 4 comments
Labels: enhancement, help wanted (open for new contributors to work on)

Comments


timonf commented May 20, 2020

Hi!

I like the idea of tldr! But it would be very nice to have an easier way to add Markdown files for custom commands.

I want to add my own tldr pages for custom commands (available to my team only) without replacing or forking the existing pages.

Expected behavior

I can use multiple data sources in .tldrrc and can also add local directories (or at least local zip files):

{
  "repositories": [
    "http://tldr.sh/assets/tldr.zip",
    "~/Projects/build-tools-tldr"
  ]
}

Actual behavior

I have to run:
zip -r tldr.zip . && npx serve .

Then I have to add the repository to .tldrrc:

{
  "repository": "http://localhost:5000/tldr.zip"
}

…and I no longer have access to the repository at http://tldr.sh/assets/tldr.zip.
I could fork the tldr repository… but just to add a few Markdown files?

Log, debug output

Scenario 1: Using local file

Putting a local file into the "repository" field:

.tldrrc:

{
  "repository": "/tmp/tldr.zip"
}

Output:

Page not found. Updating cache...Error: Invalid URI "/tmp/tldr.zip"

Scenario 2: Using working repository with custom files only

.tldrrc:

{
  "repository": "http://localhost:5000/tldr.zip"
}

Output:

$ tldr echo
✔ Page not found. Updating cache...
✔ Creating index...
Page not found.
If you want to contribute it, feel free to send a pull request to: https://github.com/tldr-pages/tldr

Environment

  • Directory structure for custom tldr pages is simple (pages/build-tools/do-something.md etc.)
agnivade (Member) commented May 25, 2020

It is a reasonable request. But since the pages are internal to your team, I am wondering how much effort it is to just periodically fetch the master pages and add your local set of pages. If there were multiple remote sources to merge from, that would be a valid request, as keeping multiple sources in sync would be difficult.

There is also another nuance in dealing with multiple remotes - file name collisions. Within a single source, the filesystem guarantees that all file names are unique. But when the client needs to merge from multiple sources, the spec needs to clearly define what happens when pages with the same name come from different sources.

In short, properly implementing this would require a spec change. But given the cost-benefit ratio here, I am wondering if the sync-and-merge method may give the biggest bang for the buck.

I would guess people already using private forks use the same method to keep their pages in sync.
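
For illustration, a minimal sketch of that sync-and-merge flow could look something like this (the team-pages path comes from the report above; the output location and the served URL are just examples):

# Minimal sketch of the sync-and-merge approach; paths are examples only.
curl -fsSL -o /tmp/upstream.zip http://tldr.sh/assets/tldr.zip
rm -rf /tmp/merged && mkdir -p /tmp/merged
unzip -q /tmp/upstream.zip -d /tmp/merged
mkdir -p /tmp/merged/pages
# Copy the team's pages last so they win any file name collisions.
cp -r ~/Projects/build-tools-tldr/pages/. /tmp/merged/pages/
(cd /tmp/merged && zip -qr /tmp/tldr.zip .)
# Serve the result and point "repository" at http://localhost:5000/tldr.zip as above.
npx serve /tmp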

sbrl (Member) commented May 27, 2020

Yeah, this would require a pretty major update to the tldr client spec - specifically in terms of the page resolution algorithm.

timonf (Author) commented May 28, 2020

> It is a reasonable request. But since the pages are internal to your team, I am wondering how much effort it is to just periodically fetch the master pages and add your local set of pages. If there were multiple remote sources to merge from, that would be a valid request, as keeping multiple sources in sync would be difficult.

Private forks and uploading a zip to a non-public server make it quite complex and cumbersome.

> There is also another nuance in dealing with multiple remotes - file name collisions. Within a single source, the filesystem guarantees that all file names are unique. But when the client needs to merge from multiple sources, the spec needs to clearly define what happens when pages with the same name come from different sources.

You could just use the order of the "repositories" array: later entries would overwrite earlier ones.
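
To sketch what I mean: the client could simply fetch each entry in array order into one cache directory, so pages from later entries replace same-named pages from earlier ones (the cache path and the hard-coded list here are illustrative only, not the client's real layout):

# Sketch of the proposed rule: apply "repositories" in array order;
# later entries overwrite same-named pages from earlier ones.
CACHE="$HOME/.tldr-cache"   # illustrative cache location
rm -rf "$CACHE" && mkdir -p "$CACHE"
for repo in http://tldr.sh/assets/tldr.zip "$HOME/Projects/build-tools-tldr"; do
  case "$repo" in
    http*)
      curl -fsSL -o /tmp/repo.zip "$repo"
      unzip -qo /tmp/repo.zip -d "$CACHE"   # -o overwrites existing files
      ;;
    *)
      cp -r "$repo"/. "$CACHE"/             # local directory overlays the cache
      ;;
  esac
done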

agnivade (Member) commented

I don't think you need to upload anything. One could just run a cron job on that server which periodically fetches the tldr master pages and merges them with the local copy of the pages. Of course, if you are updating your local copy, then you need to upload. But if you are behind a VPN, it's as simple as an scp.
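
For example, a crontab entry along these lines would do it (the script name, schedule, and host are hypothetical; the script would just contain the fetch-and-merge steps sketched above):

# Hypothetical crontab entry: rebuild the merged zip nightly, then
# push it to the internal web server over scp.
0 3 * * * /usr/local/bin/merge-tldr-pages.sh && scp /tmp/tldr.zip intranet:/var/www/tldr.zip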

> You could just use the order of the "repositories" array: later entries would overwrite earlier ones.

Yes, I think that works. Open to anybody to update the spec and implement it. I am happy to review the code.
