Categories: FAANG

Analyzing the Effect of Linguistic Similarity on Cross-Lingual Transfer: Tasks and Input Representations Matter

Cross-lingual transfer is a popular approach to increase the amount of training data for NLP tasks in a low-resource context. However, the best strategy to decide which cross-lingual data to include is unclear. Prior research often focuses on a small set of languages from a few language families or a single task. It is still an open question how these findings extend to a wider variety of languages and tasks. In this work, we contribute to this question by analyzing cross-lingual transfer for 263 languages from a wide variety of language families. Moreover, we include three popular NLP tasks…
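For readers unfamiliar with the setup the abstract describes, here is a minimal sketch of the basic cross-lingual transfer recipe: fine-tune a multilingual encoder on pooled training data from a high-resource source language plus a small amount of target-language data, then evaluate on the target language. This is only an illustration of the general technique, not the paper's experimental pipeline; the dataset names (my_org/topic_cls and its language configs) and the label count are hypothetical placeholders.

```python
# Minimal cross-lingual transfer sketch with Hugging Face Transformers.
# Assumption: a hypothetical classification dataset "my_org/topic_cls"
# with one config per language and a standard "text"/"label" schema.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from datasets import concatenate_datasets, load_dataset

MODEL_NAME = "xlm-roberta-base"  # any multilingual encoder works here

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

# Hypothetical splits: a large source-language corpus and a small
# target-language corpus for the same task.
source = load_dataset("my_org/topic_cls", "high_resource_lang", split="train")
target = load_dataset("my_org/topic_cls", "low_resource_lang", split="train")
target_test = load_dataset("my_org/topic_cls", "low_resource_lang", split="test")

# Cross-lingual transfer in its simplest form: pool the source-language
# data with the target-language data and train once.
train = concatenate_datasets([source, target]).map(tokenize, batched=True)
test = target_test.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xlt-run", num_train_epochs=3),
    train_dataset=train,
    eval_dataset=test,
    tokenizer=tokenizer,
)
trainer.train()
print(trainer.evaluate())  # zero-/few-shot performance on the target language
```

The open question the paper studies is which source languages to pool in this way; swapping the source config above for data from a linguistically closer or more distant language is exactly the kind of choice the analysis addresses.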
