Arrow Research

Author name cluster

Hangyu Lin

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

3 papers
2 author rows

Possible papers (3)

IROS 2025 Conference Paper

MIVG: Mode-Isolated Velocity-Guide Algorithm for Quadratic Optimization-Based Obstacle Avoidance

  • Hangyu Lin
  • Xiaoqi Chen
  • Songyin Cai
  • Xiangrui Lin
  • Kunpeng Wu

Dynamic obstacle avoidance is a challenging problem in robotic control, with many algorithms developed to balance efficiency and real-time performance. Existing resolved-rate motion control (RRMC) methods formulate obstacle avoidance as a quadratic programming (QP) problem. However, the lack of directional guidance for obstacle avoidance and frequent constraint conflicts often lead to execution failures. In this work, we propose the Mode-Isolated Velocity-Guide (MIVG) algorithm, which deploys a dual-mode isolation strategy combined with a Velocity-Guide Potential Field (VGPF). This novel approach separates obstacle avoidance from target-driven tasks while providing velocity-based directional guidance. Simulations on a 7-degree-of-freedom Franka Emika Panda robot demonstrate that our approach significantly enhances task success rates while maintaining real-time feasibility, improving execution success rates by 35.0%~52.0% over the baseline RRMC strategy (NEO). Additionally, we analyze the impact of key parameters through simulations, further validating the effectiveness of the proposed algorithm in dynamic environments.
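The abstract describes the mechanism only at a high level. The toy sketch below illustrates the two ingredients it names, mode isolation and a velocity-based guide term, in plain Python; every function name, gain, and threshold here is an assumption for illustration, not the paper's actual QP formulation:

```python
import numpy as np

# Illustrative gains and activation distance; the paper's actual
# parameters and potential-field shape are not specified here.
K_ATTRACT = 1.0      # gain pulling the end effector toward the target
K_REPULSE = 0.5      # gain of the velocity-guide repulsive term
D_ISOLATE = 0.3      # distance (m) below which avoidance mode takes over

def velocity_guide(ee_pos, ee_vel, obs_pos, obs_vel):
    """Hypothetical velocity-guide term: repel along the line from the
    obstacle, weighted by how fast the obstacle is closing in."""
    rel_pos = ee_pos - obs_pos
    dist = np.linalg.norm(rel_pos)
    closing = max(0.0, -np.dot(ee_vel - obs_vel, rel_pos) / dist)
    return K_REPULSE * closing * rel_pos / dist**2

def mode_isolated_velocity(ee_pos, ee_vel, target_pos, obs_pos, obs_vel):
    """Dual-mode switch: track the target when clear, follow the
    velocity guide when an obstacle is close. A toy stand-in for the
    QP-based controller described in the abstract."""
    if np.linalg.norm(ee_pos - obs_pos) > D_ISOLATE:
        return K_ATTRACT * (target_pos - ee_pos)            # tracking mode
    return velocity_guide(ee_pos, ee_vel, obs_pos, obs_vel)  # avoidance mode

# Example: end effector near a moving obstacle.
v = mode_isolated_velocity(
    ee_pos=np.array([0.4, 0.0, 0.3]),
    ee_vel=np.array([0.1, 0.0, 0.0]),
    target_pos=np.array([0.8, 0.0, 0.3]),
    obs_pos=np.array([0.5, 0.0, 0.3]),
    obs_vel=np.array([-0.2, 0.0, 0.0]),
)
print(v)  # repulsive velocity command, since the obstacle is within D_ISOLATE
```

In the real method the avoidance and tracking objectives are handled inside a QP over joint velocities; the hard switch above only illustrates the isolation idea.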

NeurIPS 2025 Conference Paper

RoPE to NoPE and Back Again: A New Hybrid Attention Strategy

  • Bowen Yang
  • Bharat Venkitesh
  • Dwaraknath Gnaneshwar Talupuru
  • Hangyu Lin
  • David Cairuz
  • Phil Blunsom
  • Acyr Locatelli

Long-context large language models (LLMs) have achieved remarkable advancements, driven by techniques like Rotary Position Embedding (RoPE) (Su et al., 2023) and its extensions (Chen et al., 2023; Liu et al., 2024c; Peng et al., 2023). By adjusting RoPE parameters and incorporating training data with extended contexts, we can train performant models with considerably longer input sequences. However, existing RoPE-based methods exhibit performance limitations when applied to extended context lengths. This paper presents a comprehensive analysis of various attention mechanisms, including RoPE, No Positional Embedding (NoPE), and Query-Key Normalization (QK-Norm), identifying their strengths and shortcomings in long-context modeling. Our investigation identifies distinctive attention patterns in these methods and highlights their impact on long-context performance, providing valuable insights for architectural design. Building on these findings, we propose a novel architecture featuring a hybrid attention mechanism that integrates global and local attention spans. This design not only surpasses conventional RoPE-based transformer models with full attention in both long- and short-context tasks but also delivers substantial efficiency gains during training and inference.
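As a rough illustration of the hybrid design the abstract describes, the sketch below interleaves global NoPE attention with local RoPE attention in NumPy. The layer schedule, window size, and interleaving ratio are assumptions for illustration; the paper's actual architecture is not reproduced here:

```python
import numpy as np

def rope(x, base=10000.0):
    """Standard rotate-half RoPE applied to x of shape (seq, dim); dim even."""
    seq, dim = x.shape
    half = dim // 2
    freqs = base ** (-np.arange(half) / half)          # per-pair frequencies
    angles = np.arange(seq)[:, None] * freqs[None, :]  # (seq, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)

def attention(q, k, v, window=None):
    """Single-head causal attention; optional sliding-window span."""
    seq, dim = q.shape
    scores = q @ k.T / np.sqrt(dim)
    i, j = np.indices((seq, seq))
    mask = j > i                                       # causal mask
    if window is not None:
        mask |= (i - j) >= window                      # restrict to local span
    scores = np.where(mask, -np.inf, scores)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def hybrid_layer(q, k, v, layer_idx, local_window=128, global_every=4):
    """Hypothetical schedule: most layers use local attention with RoPE;
    every `global_every`-th layer attends globally with no positional
    rotation (NoPE)."""
    if layer_idx % global_every == 0:
        return attention(q, k, v, window=None)          # global, NoPE
    return attention(rope(q), rope(k), v, window=local_window)

# Demo: 16 tokens, 8-dim head.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((16, 8)) for _ in range(3))
print(hybrid_layer(q, k, v, layer_idx=0).shape)                  # global layer
print(hybrid_layer(q, k, v, layer_idx=1, local_window=4).shape)  # local layer
```

One plausible reading of the split, consistent with the abstract: local RoPE layers keep fine-grained positional precision inside a window, while occasional global NoPE layers carry long-range information without rotary extrapolation.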

AAAI 2016 Conference Paper

Verb Pattern: A Probabilistic Semantic Representation on Verbs

  • Wanyun Cui
  • Xiyou Zhou
  • Hangyu Lin
  • Yanghua Xiao
  • Haixun Wang
  • Seung-won Hwang
  • Wei Wang

Verbs are important in semantic understanding of natural language. Traditional verb representations, such as FrameNet, PropBank, and VerbNet, focus on verbs’ roles. These roles are too coarse to represent verbs’ semantics. In this paper, we introduce verb patterns to represent verbs’ semantics, such that each pattern corresponds to a single sense of the verb. First, we analyze the principles for verb patterns: generality and specificity. Then we propose a nonparametric model based on description length. Experimental results demonstrate the effectiveness of verb patterns. We further apply verb patterns to context-aware conceptualization, showing that verb patterns are helpful in semantic-related tasks.
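The description-length criterion in the abstract can be made concrete with a toy example: choosing between a specific pattern ("eat $food") and a general one ("eat $thing") for the objects of "eat". The instance list, concept links, extents, and the alpha weight below are all invented for illustration and do not come from the paper:

```python
import math

# Toy objects of the verb "eat" and assumed isa links for illustration.
INSTANCES = ["apple", "bread", "apple", "noodles", "words", "apple"]
CONCEPT_OF = {
    "apple": ["food", "thing"],
    "bread": ["food", "thing"],
    "noodles": ["food", "thing"],
    "words": ["thing"],              # "eat words" is not food
}
CONCEPT_SIZE = {"food": 1000, "thing": 100000}  # assumed concept extents

def description_length(patterns, alpha=8.0):
    """Toy MDL score: model cost grows with the number of patterns;
    data cost charges -log2 P(object | concept) per instance, assuming
    objects are uniform within a concept's extent."""
    model_cost = alpha * len(patterns)
    data_cost = 0.0
    for obj in INSTANCES:
        covering = [math.log2(CONCEPT_SIZE[c])
                    for c in CONCEPT_OF[obj] if c in patterns]
        if not covering:
            raise ValueError(f"{obj} is not covered by {patterns}")
        data_cost += min(covering)   # cheapest pattern covering the instance
    return model_cost + data_cost

for candidate in (["thing"], ["food"], ["food", "thing"]):
    try:
        print(candidate, round(description_length(candidate), 1))
    except ValueError as e:
        print(candidate, e)
```

In this toy run the mixed pattern set scores best: the specific "food" pattern encodes most instances cheaply (specificity), while the general "thing" pattern is still needed to cover "eat words" (generality), mirroring the trade-off the abstract names.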