
Add FinMTEB #1267

Open
Muennighoff opened this issue Oct 1, 2024 · 3 comments · May be fixed by #1379
Labels
good first issue Good for newcomers

Comments

@Muennighoff
Contributor

Extremely nice benchmark that we should definitely integrate and add a leaderboard tab for: https://github.com/yixuantt/FinMTEB

It should be pretty easy to integrate as the code structure is very similar!

@KennethEnevoldsen KennethEnevoldsen added the good first issue Good for newcomers label Oct 3, 2024
@KennethEnevoldsen
Contributor

@yixuantt might be interested in adding the integration?

@yixuantt

yixuantt commented Oct 3, 2024

Thanks for the kind words! Unfortunately, I may not have enough bandwidth to work on it at the moment. (Maybe later if needed.)

Here is a short description of the differences between the two benchmarks: almost all task-evaluation processes are the same except for summarization. In FinMTEB, summarization is evaluated on the source text and its summary, rather than on a human summary versus a machine summary. For the other tasks, simply copying the task dataset metadata is fine.
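A minimal sketch of what "copying the task dataset metadata" could look like for a new FinMTEB task. This uses a plain dataclass loosely modeled on the fields MTEB task definitions typically carry; the class name, field names, and the example task below are illustrative assumptions, not the actual mteb API or the real FinMTEB integration:

```python
from dataclasses import dataclass, field

# Illustrative stand-in for mteb-style task metadata; the field names are
# assumptions modeled on typical MTEB task definitions, not the real API.
@dataclass
class TaskMetadataSketch:
    name: str
    description: str
    reference: str
    type: str  # e.g. "Classification", "Retrieval", "Summarization"
    eval_splits: list = field(default_factory=lambda: ["test"])
    eval_langs: list = field(default_factory=lambda: ["eng-Latn"])
    main_score: str = "accuracy"

# Hypothetical finance-domain task: copy the metadata over, point it at the
# finance dataset, and reuse the existing evaluation pipeline unchanged.
fin_task = TaskMetadataSketch(
    name="FinSentimentClassification",  # illustrative name only
    description="Sentiment classification of financial news sentences.",
    reference="https://github.com/yixuantt/FinMTEB",
    type="Classification",
)
```

Per the comment above, summarization would be the one task type whose evaluation logic needs adapting (text vs. its summary), while tasks like the one sketched here would only need metadata changes.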

@KennethEnevoldsen
Contributor

Thanks, @yixuantt, great to have those details!

@alt-glitch alt-glitch linked a pull request Nov 4, 2024 that will close this issue

4 participants