<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Serverless | Sen He</title><link>https://senhe.ai/tags/serverless/</link><atom:link href="https://senhe.ai/tags/serverless/index.xml" rel="self" type="application/rss+xml"/><description>Serverless</description><generator>HugoBlox Kit (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Sun, 01 Jan 2023 00:00:00 +0000</lastBuildDate><image><url>https://senhe.ai/media/icon_hu_da05098ef60dc2e7.png</url><title>Serverless</title><link>https://senhe.ai/tags/serverless/</link></image><item><title>Performance Testing for Cloud and Serverless Computing</title><link>https://senhe.ai/projects/cloud-testing/</link><pubDate>Sun, 01 Jan 2023 00:00:00 +0000</pubDate><guid>https://senhe.ai/projects/cloud-testing/</guid><description>&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;Cloud computing introduces fundamental challenges for performance testing due to resource contention, hidden scheduling policies, and auto-scaling that testers cannot directly control. These challenges are amplified in &lt;strong&gt;serverless computing&lt;/strong&gt;, where resources are abstracted at a higher level and auto-scaling behavior is not well characterized.&lt;/p&gt;
&lt;h2 id="research-thrusts"&gt;Research Thrusts&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;Serverless Auto-Scaling Characterization&lt;/strong&gt;: We define and characterize the auto-scaling stages of serverless platforms, decomposing performance uncertainty into resource contention during execution and cold-start latency during environment initialization.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Monte Carlo Simulation-Based Testing&lt;/strong&gt;: We develop simulation-based methodologies that can predict performance distributions of serverless applications, accounting for the stochastic nature of cloud performance fluctuations.&lt;/p&gt;
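&lt;p&gt;As an illustrative sketch (all parameters below are hypothetical, not measured values from any platform), such a simulation can model each invocation's latency as a contention-perturbed execution time plus an occasional cold-start penalty, then estimate tail percentiles from many sampled invocations:&lt;/p&gt;

```python
import random

# Illustrative parameters only; real values would be measured per platform.
def sample_latency(exec_ms=40.0, contention_sd=8.0, cold_ms=900.0, p_cold=0.1):
    """Draw one end-to-end invocation latency in milliseconds."""
    # Resource contention perturbs the warm execution time.
    latency = max(0.0, random.gauss(exec_ms, contention_sd))
    # With probability p_cold, the environment must be initialized first.
    is_cold = random.choices([True, False], weights=[p_cold, 1.0 - p_cold])[0]
    if is_cold:
        latency += cold_ms
    return latency

def simulate(n=100_000, seed=42):
    """Monte Carlo estimate of median and 99th-percentile latency."""
    random.seed(seed)
    samples = sorted(sample_latency() for _ in range(n))
    return samples[n // 2], samples[int(n * 0.99)]

p50, p99 = simulate()
print(f"p50 = {p50:.1f} ms, p99 = {p99:.1f} ms")
```

&lt;p&gt;Even this toy model reproduces the heavy upper tail that cold starts induce: the median reflects warm executions, while the 99th percentile is dominated by cold-start latency.&lt;/p&gt;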
&lt;p&gt;&lt;strong&gt;AI-Based Cloud Emulation&lt;/strong&gt;: We build systems that emulate cloud environments on local machines, helping users obtain accurate performance results at reduced testing cost, particularly when multiple applications must be evaluated.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Performance Assurance in DevOps&lt;/strong&gt;: We use development and operational data to detect performance regressions early in the software delivery cycle, with attention to end-user impact.&lt;/p&gt;</description></item></channel></rss>