
Live actualisation of news feed #50

Open
Pyr0technicien opened this issue Oct 12, 2021 · 5 comments
Labels
question Further information is requested

Comments

Pyr0technicien commented Oct 12, 2021

Hey,

Is there any way to update the feed of each tab dynamically, as soon as the collector has finished collecting new data?

I would like to have an instance of Taranis running live on a big screen in our office.

markov2 (Contributor) commented Oct 13, 2021

In our instance (and probably in others as well), the number of items to be processed is quite large; an automatic refresh might hinder the analysts' work.

Have you seen taranis4u, which is part of the distribution?

markov2 added the question label Oct 13, 2021
Pyr0technicien (Author) commented

Thanks for the reply. I have seen taranis4u, but it doesn't seem to help with my question; it looks to me like it is mostly statistics.

To avoid implementing a refreshing GUI and interfering with Taranis's processing, I decided to export the news generated by Taranis daily (or so), so that I can send it as an HTML mailing list to our customers. What would be the best approach for this? Is there API documentation that would allow me to easily retrieve all news from one category, or should I query the database directly?

Thanks in advance

markov2 (Contributor) commented Dec 6, 2021

Taranis4u uses a REST interface, which can easily be extended to collect and display other internal information. In our case, the collector detail information would be far too much to conveniently display on an office screen, hence only statistics are shown.

Your second paragraph is a different subject. Taranis does not produce "news"; do you mean advisories? Are those your own advisories or those produced by NCSC? You can collect various kinds of details from your Taranis instance via the REST interface;
see pm/Taranis/REST/Advisories.pm (I welcome your extensions ;-)

In our own extensions to the public Taranis, we also made a hook to upload new advisories to our website, https://advisories.ncsc.nl. However, that's more complex than you need.
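For the daily-export use case discussed above, a small script could pull advisories over the REST interface and render them as an HTML digest for a mailing list. This is only a sketch: the base URL and the `/taranis/REST/advisories` path are hypothetical placeholders, and the real routes and JSON shape must be checked against pm/Taranis/REST/Advisories.pm in your own installation.

```python
# Sketch of a daily digest export. The endpoint path and the payload
# fields ("title", "link") are ASSUMPTIONS, not documented Taranis API;
# verify them against pm/Taranis/REST/Advisories.pm on your instance.
import json
import urllib.request
from html import escape

TARANIS_BASE = "https://taranis.example.org"  # hypothetical instance URL


def fetch_advisories(base_url: str) -> list:
    """Fetch advisories as JSON from the (hypothetical) REST endpoint."""
    with urllib.request.urlopen(f"{base_url}/taranis/REST/advisories") as resp:
        return json.load(resp)


def render_digest(items: list) -> str:
    """Render a list of {'title': ..., 'link': ...} dicts as an HTML digest."""
    rows = "\n".join(
        f'<li><a href="{escape(it["link"], quote=True)}">{escape(it["title"])}</a></li>'
        for it in items
    )
    return f"<html><body><h1>Daily digest</h1><ul>\n{rows}\n</ul></body></html>"


if __name__ == "__main__":
    sample = [{"title": "Example advisory", "link": "https://example.org/adv/1"}]
    print(render_digest(sample))
```

The resulting HTML string can be handed to any mailer (e.g. via `smtplib`) and the script scheduled with cron for the once-a-day cadence mentioned above.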

Pyr0technicien (Author) commented

Hey Markov,

Thanks again for your reply; this is indeed a different question, since we decided not to go with the live actualization.
I might be wrong here, but the collector collects news items from the sources we set up in the settings.

We would like to export these automatically on a daily basis to provide our customers with the latest news related to their products / environment. So, before any processing in Taranis, we would for now just like to retrieve the collected news.

markov2 (Contributor) commented Dec 6, 2021

The collector collects news items from webpages by visiting them every 20 minutes; in our configuration, that is on the order of thousands of items per day. Many websites publish roughly the same information: the whole process of analyses and advisories has been designed to create order in that mess. Do you really want to distribute that mess to other people?

From the mess of incoming news, you create Analyses, which link groups of news items together with normalized product names and CVE codes. Only then do you know whether that info is useful for some customer. If you want to cover a large part of all security issues, you will probably produce 50+ analyses per day from the hundreds of new CVEs per day.

NCSC itself limits the creation of analyses to products used by big organizations; these are then formalized in about 10 advisories per day. About 30 people are involved in that process.

IMHO, it is useless to distribute thousands of news items per day. It may be useful to distribute a few dozen analyses... but they take a lot of work to create.
