Title: Adaptive Offloading Inference for Delivering Applications in Pervasive Computing Environments
2. Outline
- Introduction (3 slides)
- Proposed approach (1 slide)
- System overview (3 slides)
- Design and algorithms (5 slides)
- Performance evaluation (5 slides)
- Conclusion (1 slide)
3. Motivation
- It is challenging to deliver complex applications on resource-constrained mobile devices
- Limitations of existing approaches:
  - Degrading an application's fidelity
  - Adaptation efficiency is limited by coarse-grained approaches
  - Expensive to rewrite an application
4. Main idea of AIDE
- AIDE: Adaptive Infrastructure for Distributed Execution
  - A fine-grained runtime offloading system
- Main idea
  - Dynamically partition the application at runtime
  - Offload part of the application's execution to a powerful nearby surrogate device
5. Key problems
- When to trigger the offloading action?
- Which objects should be offloaded?
- OLIE solves both of these problems
6. OLIE, the proposed approach
- Makes intelligent offloading decisions
  - Timely triggering of adaptive offloading
  - Intelligent selection of an application partitioning policy
- Uses the Fuzzy Control model
- Focuses only on relieving the memory constraint
- Enables AIDE to deliver resource-intensive applications with minimum overhead
7. System overview
- Triggering offloading actions and making offloading decisions
- Transforming method invocations on offloaded objects into remote invocations
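The transformation of local calls into remote invocations can be sketched with a plain Java dynamic proxy. This is only an illustration: the `Service` interface and the local stub standing in for the surrogate are hypothetical, and the slide does not specify AIDE's actual interception mechanism.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Hypothetical application interface; an offloadable object would be
// accessed through an interface like this.
interface Service {
    int compute(int x);
}

class OffloadProxy {
    // Wrap a stub so every call is intercepted; in a real offloading
    // system the handler would marshal the call and ship it to the
    // surrogate. Here it simply delegates locally for illustration.
    static Service wrap(Service remoteStub) {
        InvocationHandler handler = (proxy, method, args) ->
                method.invoke(remoteStub, args);
        return (Service) Proxy.newProxyInstance(
                Service.class.getClassLoader(),
                new Class<?>[] { Service.class },
                handler);
    }
}
```

A call such as `wrap(stub).compute(41)` goes through the handler, which is the point where a remote invocation would be issued.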
8. Program (Java) execution information
- Example execution-graph node for Class A:
  - Memory: 5 KB
  - AccessFreq: 10
  - Location: surrogate
  - isNative: false
  - InteractionFreq: 12
  - BandwidthRequirement: 1 KB
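The per-class record above maps directly onto a small Java class. Field names follow the slide; the usage example reproduces the slide's values for Class A.

```java
// One node of AIDE's execution graph, carrying the per-class
// profiling data listed on this slide.
class ClassNode {
    final String name;
    final int memoryKB;         // Memory: heap footprint of the class
    final int accessFreq;       // AccessFreq: how often the class is accessed
    final String location;      // current placement: "mobile" or "surrogate"
    final boolean isNative;     // native classes cannot be offloaded
    final int interactionFreq;  // InteractionFreq: invocations crossing this node
    final int bandwidthReqKB;   // BandwidthRequirement: data exchanged, in KB

    ClassNode(String name, int memoryKB, int accessFreq, String location,
              boolean isNative, int interactionFreq, int bandwidthReqKB) {
        this.name = name;
        this.memoryKB = memoryKB;
        this.accessFreq = accessFreq;
        this.location = location;
        this.isNative = isNative;
        this.interactionFreq = interactionFreq;
        this.bandwidthReqKB = bandwidthReqKB;
    }
}
```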
9. OLIE making offloading decisions
- Monitoring
  - Tracking the amount of free space in the Java heap, obtained from the JVM garbage collector
  - Bandwidth and delay are estimated by periodically invoking the ping system utility
- Making offloading decisions
  - The new target memory utilization
  - Classes to be offloaded
  - Classes to be pulled back
10. OLIE design and algorithms
- Goal: relieve the memory constraint with minimum overhead
  - Main source of overhead: remote function invocation delay
11. Triggering of adaptive offloading
- Based on the Fuzzy Control model
  - A generic fuzzy inference engine based on fuzzy logic theory
- Typical decision-making rule specifications, e.g.
  - If (AvailMem is low) and (AvailBW is high) then NewMemSize = low
  - If (AvailMem is low) and (AvailBW is moderate) then NewMemSize = average
- Membership functions define the mappings between numerical values and linguistic values
12. Value mappings
- In one region, AvailMem belongs to the linguistic value low with confidence 100%
- In the adjacent region, the membership of AvailMem in low follows a linearly decreasing function from 100% to 0%
- Where regions overlap, AvailMem belongs to either low or moderate, but with different confidence probabilities
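These mappings can be sketched as piecewise-linear membership functions. The breakpoints below (100/200 KB for memory, 50/100 KB/s for bandwidth) are illustrative assumptions, not values from the slides, and fuzzy AND is taken as the minimum of the antecedent memberships, a common convention in fuzzy control.

```java
// Sketch of OLIE-style fuzzification; breakpoints are assumed values.
class FuzzyTrigger {
    // Degree (0..1) to which AvailMem is "low": full confidence below
    // LOW_FULL, linearly decreasing to zero at LOW_ZERO.
    static double memLow(double availMemKB) {
        final double LOW_FULL = 100, LOW_ZERO = 200; // assumed breakpoints (KB)
        if (availMemKB <= LOW_FULL) return 1.0;
        if (availMemKB >= LOW_ZERO) return 0.0;
        return (LOW_ZERO - availMemKB) / (LOW_ZERO - LOW_FULL);
    }

    // Degree to which AvailBW is "high": mirror shape, increasing.
    static double bwHigh(double availBWKBps) {
        final double HIGH_ZERO = 50, HIGH_FULL = 100; // assumed breakpoints (KB/s)
        if (availBWKBps <= HIGH_ZERO) return 0.0;
        if (availBWKBps >= HIGH_FULL) return 1.0;
        return (availBWKBps - HIGH_ZERO) / (HIGH_FULL - HIGH_ZERO);
    }

    // Rule "If (AvailMem is low) and (AvailBW is high) then NewMemSize low":
    // the rule fires with the minimum of its antecedent memberships.
    static double ruleNewMemSizeLow(double availMemKB, double availBWKBps) {
        return Math.min(memLow(availMemKB), bwHigh(availBWKBps));
    }
}
```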
13. Intelligent partitioning selection
- All nodes with isNative = true are merged into one node N to form the first partition set
- Each neighbor of N is examined and selectively merged into the first partition set, based on one of several policies:
  - OLIE_MB (BandwidthRequirement)
    - Minimize the wireless network transmission load
  - OLIE_ML (InteractionFreq)
    - Minimize the interaction delay
  - OLIE_Combined (BandwidthRequirement, InteractionFreq, memory)
    - Keep the most active classes: those with the largest InteractionFreq and BandwidthRequirement
    - Offload the most inactive classes: those with the smallest InteractionFreq and the largest amount of memory
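The three policies above can be read as different rankings over the neighbors of the merged node N. The comparators below encode the ranking each policy uses to pick the neighbor to keep on the mobile device; the combined score is an illustrative formula, since the slide names only the metrics it considers, not how they are weighted.

```java
import java.util.Comparator;
import java.util.List;

// Neighbor-selection policies from this slide, expressed as rankings.
class PartitionPolicies {
    static class Neighbor {
        final int interactionFreq, bandwidthReqKB, memoryKB;
        Neighbor(int interactionFreq, int bandwidthReqKB, int memoryKB) {
            this.interactionFreq = interactionFreq;
            this.bandwidthReqKB = bandwidthReqKB;
            this.memoryKB = memoryKB;
        }
    }

    // OLIE_MB: keep the neighbor with the largest BandwidthRequirement
    // local, minimizing wireless network transmission load.
    static final Comparator<Neighbor> MB =
            Comparator.comparingInt((Neighbor n) -> n.bandwidthReqKB).reversed();

    // OLIE_ML: keep the neighbor with the largest InteractionFreq local,
    // minimizing remote interaction delay.
    static final Comparator<Neighbor> ML =
            Comparator.comparingInt((Neighbor n) -> n.interactionFreq).reversed();

    // OLIE_Combined: an illustrative "activity per KB" score; active small
    // classes rank high (keep local), inactive large ones rank low (offload).
    static double combinedScore(Neighbor n) {
        return (double) (n.interactionFreq + n.bandwidthReqKB) / n.memoryKB;
    }

    // First neighbor in the policy's ranking, i.e. the one to keep local.
    static Neighbor keep(List<Neighbor> neighbors, Comparator<Neighbor> policy) {
        return neighbors.stream().min(policy).orElseThrow();
    }
}
```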
14. Decision-making algorithm
- Mi: memory size of Java class i
- EG = {N0, N1, ..., Nn}: the execution graph
- CMT: the maximum memory size for a class node
- NewMemoryUtilization = -1

  while (offloading service is on)
      while (no significant changes happen)
          perform executions and update EG accordingly
          while (Σ Mi > CMT)
              create a new node to represent class i
      // make the adaptive offloading triggering decision
      SetLingVar()            // set numerical values for all input linguistic variables
      fuzzify()               // map the numerical values to the linguistic values
      FuzzyInferenceEngine()  // check rules and update NewMemoryUtilization
      defuzzify()             // map the linguistic values to the numerical values
      if (NewMemoryUtilization == -1) then
          offloading is not triggered
      else  // make the partitioning decision
          merge all non-offloadable classes into a node N
          while (size(EG) > 1)
              merge(N, one of its neighbors NBj)
              if (current cut is better) bestPos = NBj
          Partition_mobile_device = {N0, ..., N_bestPos}
          Partition_surrogate = {N_bestPos+1, ..., Nn}
15. Performance evaluation
- Using extensive trace-driven simulations
- Application execution traces
  - Executed on a Linux desktop machine
  - Obtained by querying an instrumented JVM
  - Trace file records:
    - Method invocations
    - Data field accesses
    - Object creations and deletions
- Wireless network traces
  - Collected with the ping system utility on an IBM ThinkPad
  - IEEE 802.11 WaveLAN network card
16. Simulator
- Only considers the average RTT for small packets (about 2.4 ms on average)
- Remote function invocation overhead: RTT/2
- Remote data access overhead: RTT
- Migration overhead: (Σ memory of the classes to be migrated) / (current available bandwidth)
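The cost model above reduces to three formulas, written out below as plain arithmetic. The units (ms for RTT, KB and KB/s for migration) are assumptions consistent with the slides.

```java
// The simulator's overhead model, as plain arithmetic.
class OverheadModel {
    // A remote function invocation pays one network traversal.
    static double remoteInvocationMs(double rttMs) { return rttMs / 2.0; }

    // A remote data access pays a full round trip.
    static double remoteDataAccessMs(double rttMs) { return rttMs; }

    // Migration delay: total memory of the classes to be migrated,
    // divided by the currently available bandwidth.
    static double migrationDelaySec(double totalMemoryKB, double bandwidthKBps) {
        return totalMemoryKB / bandwidthKBps;
    }
}
```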
17. Compared approaches
- Both use one simple fixed policy:
  - Trigger offloading when availMemory < 5% of totalMemory
  - Set newMemoryUtilization to 80% of totalMemory
- The Random algorithm keeps randomly selected classes
- The LRU algorithm offloads the least recently used classes, according to the AccessFreq of each class
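Reading the slide's thresholds as percentages of total memory (an interpretation of the garbled "5" and "80" figures), the baselines' fixed policy is a one-line trigger check plus a one-line target:

```java
// Fixed offloading policy shared by the Random and LRU baselines,
// reading the slide's thresholds as percentages of total memory.
class FixedPolicy {
    // Trigger offloading once free memory drops below 5% of the total.
    static boolean shouldOffload(double availMemory, double totalMemory) {
        return availMemory < 0.05 * totalMemory;
    }

    // After offloading, target a memory utilization of 80% of the total.
    static double newMemoryUtilization(double totalMemory) {
        return 0.80 * totalMemory;
    }
}
```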
18. Experimental applications
19. Results
20. Conclusion
- OLIE relieves memory constraints for mobile devices with much lower overhead than other common approaches
- Major contributions
  - Identifying two key decision-making problems in adaptive offloading
  - Applying the Fuzzy Control model to OLIE
  - Proposing three policies for selecting application partitions
21. Questions?