Paper Title

Hybrid Inlining: A Compositional and Context Sensitive Static Analysis Framework

Paper Authors

Jiangchao Liu, Jierui Liu, Peng Di, Diyu Wu, Hengjie Zheng, Alex Liu, Jingling Xue

Paper Abstract

Context sensitivity is essential for achieving precision in inter-procedural static analysis. To be (fully) context sensitive, a top-down analysis needs to fully inline all statements of the callees at each callsite, leading to statement explosion. Compositional analysis, which inlines summaries of the callees, scales up but often loses precision, as it is not strictly context sensitive. We propose a compositional and strictly context sensitive framework for static analysis. The framework is based on one key observation: a compositional static analysis often loses precision only on some critical statements that need to be analyzed context sensitively. Our approach hybridly inlines the critical statements and the summaries of the non-critical statements of each callee, thus avoiding re-analysis of the non-critical ones. In addition, our analysis lazily summarizes the critical statements, stopping their propagation once the accumulated calling context is adequate. Hybrid Inlining can be as precise as a context sensitive top-down analysis. We have designed and implemented a pointer analysis based on this framework. It can analyze large Java programs, from the Dacapo benchmark suite and from industry, in minutes. In our evaluation, compared to a context insensitive analysis, Hybrid Inlining incurs only 65% and 1% additional time overhead on Dacapo and the industrial applications, respectively.
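The abstract describes the mechanism only at a high level. Below is a minimal sketch of our reading of the idea (in Java, since the implemented pointer analysis targets Java programs): a callee is summarized once in a bottom-up pass, its context-independent effects are reused at every callsite, and only the statements marked critical are inlined and re-analyzed under each caller's context. All names here (Statement, Summary, summarize, applyAtCallsite) are hypothetical illustrations, not the authors' actual API or implementation.

```java
import java.util.ArrayList;
import java.util.List;

public class HybridInliningSketch {
    /** A statement of the analyzed program (hypothetical IR). */
    record Statement(String text, boolean critical) {}

    /** A callee's summary: context-independent effects, plus the
     *  critical statements that still need the caller's context. */
    record Summary(List<String> contextIndependentEffects,
                   List<Statement> criticalStatements) {}

    /** Compositional (bottom-up) phase: summarize a callee once.
     *  Non-critical statements are folded into the summary; critical
     *  ones are kept aside to be inlined at each callsite. */
    static Summary summarize(List<Statement> calleeBody) {
        List<String> effects = new ArrayList<>();
        List<Statement> critical = new ArrayList<>();
        for (Statement s : calleeBody) {
            if (s.critical()) {
                critical.add(s);                          // defer: needs calling context
            } else {
                effects.add("effect(" + s.text() + ")");  // analyzed exactly once
            }
        }
        return new Summary(effects, critical);
    }

    /** At a callsite: apply the precomputed summary without re-analyzing
     *  non-critical statements, and inline only the critical statements,
     *  analyzing them under the caller's context. Per the abstract's lazy
     *  summarization, a critical statement would stop propagating upward
     *  once the accumulated context suffices to resolve it. */
    static void applyAtCallsite(Summary callee, String callingContext) {
        callee.contextIndependentEffects()
              .forEach(e -> System.out.println("apply " + e));
        for (Statement s : callee.criticalStatements()) {
            System.out.println("inline " + s.text() + " under " + callingContext);
        }
    }

    public static void main(String[] args) {
        List<Statement> callee = List.of(
                new Statement("x = new A()", false),
                new Statement("return param.f", true));   // result depends on the caller
        Summary sum = summarize(callee);
        applyAtCallsite(sum, "ctx@call1");
        applyAtCallsite(sum, "ctx@call2");  // summary reused; only the critical part re-analyzed
    }
}
```

The sketch shows why the approach avoids statement explosion: the non-critical bulk of each callee is analyzed once and replayed as a summary, while only the (typically small) critical remainder is inlined per context.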
