
[Question]: Can I diagnose abnormal jobs directly from YARN/Spark applications without collecting scheduling metadata? #220

liangrui1988 opened this issue Apr 18, 2024 · 4 comments
Labels
question Further information is requested

Comments

@liangrui1988

Contact Details

liangrui@yy.com

What would you like to ask or discuss?

Hello! Is it possible to automatically trigger job diagnosis directly from YARN and Spark metadata and logs, without depending on scheduling-system metadata?
Without collecting scheduling metadata, I have configured Spark and YARN collection and can diagnose offline jobs by job ID.
However, the Compass web page shows no job information at all, so I assume Compass relies on scheduling metadata to trigger automatic diagnosis, right?
Our scheduling system is developed in-house and we don't want to integrate it for now; we would like to diagnose YARN/Spark jobs directly first.
Is this scenario supported, and where would automatic diagnosis be triggered in that case?

@liangrui1988 liangrui1988 added the question Further information is requested label Apr 18, 2024
@nilnon
Collaborator

nilnon commented Apr 18, 2024

@liangrui1988 You can refer to the documentation.

@liangrui1988
Author

liangrui1988 commented Apr 19, 2024

@nilnon

> @liangrui1988 You can refer to the documentation.

What I meant is not offline diagnosis of a single job, but automatic diagnosis of all jobs running on YARN, so that the web page shows which jobs were diagnosed as abnormal each day.
How should this be adjusted or configured?

@meijing123

It is feasible to implement a synchronization module that pulls completed YARN jobs and invokes the diagnostic interfaces, with some customization on top of the source code.

@liangrui1988
Author

> It is feasible to implement a synchronization module that pulls completed YARN jobs and invokes the diagnostic interfaces, with some customization on top of the source code.

OK, thank you. Let me look into it. How would we go about doing that?
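
For reference, here is a minimal sketch of the kind of synchronization module described above, under stated assumptions: it polls the YARN ResourceManager REST API (`/ws/v1/cluster/apps`, which is a standard YARN interface) for recently finished applications and submits each application id to a diagnostic endpoint. The ResourceManager host, the `COMPASS_DIAG_URL` endpoint, the request body, and the polling interval are illustrative placeholders, not actual Compass interfaces; substitute whatever diagnostic interface you expose after customizing the source code.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.time.Instant;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Sketch of a standalone sync module: periodically pull finished applications
 * from the YARN ResourceManager REST API and hand each application id to a
 * diagnostic endpoint. COMPASS_DIAG_URL is a hypothetical placeholder, not a
 * documented Compass API; hosts, ports and the poll interval are examples.
 */
public class YarnAppSyncer {
    private static final String RM_APPS_URL =
            "http://rm-host:8088/ws/v1/cluster/apps?states=FINISHED,FAILED,KILLED&finishedTimeBegin=%d";
    private static final String COMPASS_DIAG_URL = "http://compass-host:7075/api/diagnose"; // hypothetical

    private static final HttpClient HTTP = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(10))
            .build();

    public static void main(String[] args) throws Exception {
        long lastPollMs = Instant.now().minus(Duration.ofMinutes(10)).toEpochMilli();
        while (true) {
            long pollStartMs = Instant.now().toEpochMilli();
            String json = fetch(String.format(RM_APPS_URL, lastPollMs));
            // Crude extraction of application ids; a real module would parse the
            // JSON properly (e.g. with Jackson) and de-duplicate ids.
            Matcher m = Pattern.compile("\"id\"\\s*:\\s*\"(application_\\d+_\\d+)\"").matcher(json);
            while (m.find()) {
                submitForDiagnosis(m.group(1));
            }
            lastPollMs = pollStartMs;
            Thread.sleep(Duration.ofMinutes(5).toMillis());
        }
    }

    private static String fetch(String url) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url)).GET().build();
        return HTTP.send(req, HttpResponse.BodyHandlers.ofString()).body();
    }

    private static void submitForDiagnosis(String appId) throws Exception {
        // POST the application id to the diagnostic interface (placeholder endpoint and payload).
        HttpRequest req = HttpRequest.newBuilder(URI.create(COMPASS_DIAG_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"applicationId\":\"" + appId + "\"}"))
                .build();
        int status = HTTP.send(req, HttpResponse.BodyHandlers.ofString()).statusCode();
        System.out.println("Submitted " + appId + " -> HTTP " + status);
    }
}
```

In practice this loop would run as a scheduled service alongside Compass, and the regex-based id extraction would be replaced with a proper JSON parser.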
