Adaptive Offloading Inference for Delivering Application in Pervasive Computing Environments

1
Adaptive Offloading Inference for Delivering
Application in Pervasive Computing Environments
  • Presented by Jinhui Qin

2
Outline
  • Introduction (3 slides)
  • Proposed approach (1 slide)
  • System overview (3 slides)
  • Design and algorithms (5 slides)
  • Performance evaluation (5 slides)
  • Conclusion (1 slide)

3
Motivation
  • It is challenging to deliver complex applications
    on resource-constrained mobile devices
  • Limitations of existing approaches
  • Degrading an application's fidelity
  • Adaptation efficiency is limited by
    coarse-grained approaches
  • Expensive to rewrite an application

4
Main idea of AIDE
  • AIDE
  • Adaptive Infrastructure for Distributed Execution
  • A fine-grained runtime offloading system
  • Main idea
  • Dynamically partitioning the application at
    runtime.
  • Offloading part of the application execution to a
    powerful nearby surrogate device.

5
Key problems
  • When to trigger the offloading action?
  • Which objects should be offloaded?
  • OLIE solves the above problems

6
OLIE , the proposed approach
  • Makes intelligent offloading decisions
  • Timely triggering of adaptive offloading
  • Intelligent selection of an application
    partitioning policy
  • Using Fuzzy Control model
  • Focuses only on relieving the memory constraint
  • Enables AIDE to deliver resource-intensive
    applications with minimum overhead

7
System overview
  • Triggering offloading actions and making
    offloading decisions
  • Transforming method invocations to offloaded
    objects into remote invocations
8
Program (Java) execution information
Class A: Memory 5KB, AccessFreq 10, Location surrogate, isNative false, InteractionFreq 12, BandwidthRequirement 1KB
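The per-class attributes shown above can be captured in a small Java class. The field names follow the slide; the class itself is an illustrative sketch, not part of the AIDE/OLIE source.

```java
// Illustrative sketch of one execution-graph node as profiled on this slide.
// Field names follow the slide; the class itself is an assumption for illustration.
class ClassNode {
    String name;
    int memoryKB;               // Memory: 5KB
    int accessFreq;             // AccessFreq: 10
    String location;            // "mobile" or "surrogate"
    boolean isNative;           // native classes cannot be offloaded
    int interactionFreq;        // InteractionFreq: 12
    int bandwidthRequirementKB; // BandwidthRequirement: 1KB

    ClassNode(String name, int memoryKB, int accessFreq, String location,
              boolean isNative, int interactionFreq, int bandwidthRequirementKB) {
        this.name = name;
        this.memoryKB = memoryKB;
        this.accessFreq = accessFreq;
        this.location = location;
        this.isNative = isNative;
        this.interactionFreq = interactionFreq;
        this.bandwidthRequirementKB = bandwidthRequirementKB;
    }
}
```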
9
OLIE making offloading decisions
  • Monitoring
  • Tracking the amount of free space in the Java
    heap, obtained from the JVM garbage collector
  • Bandwidth and delay are estimated by periodically
    invoking the ping system utility.
  • Making offloading decisions
  • The new target memory utilization
  • Classes to be offloaded
  • Classes to be pulled back
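A minimal sketch of the heap-monitoring step above, using `Runtime`'s memory counters as a stand-in for the free-space figure the paper obtains from the JVM garbage collector:

```java
// Sketch of OLIE's monitoring step. The paper reads free heap space from the
// JVM garbage collector; Runtime's memory counters are used here as a stand-in.
class MemoryMonitor {
    // Fraction of the Java heap currently free, in [0, 1].
    static double freeHeapFraction() {
        Runtime rt = Runtime.getRuntime();
        return (double) rt.freeMemory() / rt.totalMemory();
    }
}
```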

10
OLIE design and algorithms
  • Goal
  • To relieve the memory constraint with minimum
    overhead
  • Migration cost
  • Remote data access delay
  • Remote function invocation delay

11
Triggering of adaptive offloading
  • Based on the Fuzzy Control model
  • A generic fuzzy inference engine based on fuzzy
    logic theory
  • Typical Decision-making rule specifications, e.g.
  • If (AvailMem is low) and (AvailBW is high)
  • Then NewMemSize is low
  • If (AvailMem is low) and (AvailBW is moderate)
  • Then NewMemSize is average
  • Membership functions define the mappings between
    the numerical value and the linguistic values

12
Value mappings
Below a threshold, the confidence that AvailMem
belongs to the linguistic value low is 100.
Over the transition range, the confidence that
AvailMem belongs to low is a linearly decreasing
function from 100 to 0.
In the overlap region, AvailMem belongs to either
low or moderate, but with different confidence
probabilities.
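The membership mapping described above can be sketched as follows. The concrete thresholds (100 and 200) and the min-based fuzzy AND are illustrative assumptions; only the shape (full confidence below a threshold, then linearly decreasing to zero) comes from the slide.

```java
// Minimal fuzzification sketch. The linearly decreasing membership follows the
// slide's description; the exact thresholds and units are assumptions.
class Fuzzifier {
    static final double LOW_FULL = 100.0; // assumed: full confidence in "low" below this
    static final double LOW_ZERO = 200.0; // assumed: zero confidence in "low" above this

    // Confidence that AvailMem belongs to the linguistic value "low", in [0, 1].
    static double membershipLow(double availMem) {
        if (availMem <= LOW_FULL) return 1.0;
        if (availMem >= LOW_ZERO) return 0.0;
        return (LOW_ZERO - availMem) / (LOW_ZERO - LOW_FULL);
    }

    // Example rule: if (AvailMem is low) and (AvailBW is high) then NewMemSize is low.
    // Fuzzy AND is taken here as the minimum of the two confidences.
    static double ruleFires(double memIsLow, double bwIsHigh) {
        return Math.min(memIsLow, bwIsHigh);
    }
}
```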
13
Intelligent partitioning selection
  • The coalescing process
  • All nodes with isNative = true are merged into one
    node N to form the first partition set
  • Each neighbor of N is examined and, based on one
    of several policies, selected and merged into the
    first partition set:
  • OLIE_MB (BandwidthRequirement)
  • Minimize the wireless network transmission load
  • OLIE_ML (InteractionFreq)
  • Minimize the interaction delay
  • OLIE_Combined (BandwidthRequirement,
    InteractionFreq, memory)
  • Keep the most active classes
  • Those with the largest InteractionFreq and
    BandwidthRequirement
  • Offload the most inactive classes
  • Those with the smallest InteractionFreq and the
    largest amount of memory
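One way the three policies could rank a neighbor for staying on the mobile device is sketched below. Which attributes each policy consults comes from the slide; the concrete scoring formulas (higher score = keep local) are assumptions for illustration.

```java
// Sketch of candidate ranking under the three partitioning policies.
// Higher score means the class is more worth keeping on the mobile device.
class PartitionPolicy {
    // OLIE_MB: keep classes with high bandwidth requirements local,
    // minimizing wireless transmission load.
    static double scoreMB(int bandwidthRequirement) {
        return bandwidthRequirement;
    }

    // OLIE_ML: keep frequently interacting classes local,
    // minimizing interaction delay.
    static double scoreML(int interactionFreq) {
        return interactionFreq;
    }

    // OLIE_Combined: favor active classes, penalize large memory footprints
    // (assumed ratio; the slide names the inputs but not the formula).
    static double scoreCombined(int bandwidthRequirement, int interactionFreq, int memory) {
        return (double) (bandwidthRequirement + interactionFreq) / memory;
    }
}
```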

14
Decision-making algorithm
  • Mi: memory size of Java class i
  • EG = {N0, N1, ..., Nn}: execution graph
  • CMT: the maximum memory size for a class node
  • NewMemoryUtilization = -1

while (offloading service is on)
    while (no significant changes happen)
        perform executions and update EG accordingly
        while (∃ Mi > CMT)
            create a new node to represent class i
    // make the adaptive offloading triggering decision
    SetLingVar()            // set numerical values for all input linguistic variables
    fuzzify()               // map the numerical values to linguistic values
    FuzzyInferenceEngine()  // check rules and update NewMemoryUtilization
    defuzzify()             // map the linguistic values back to numerical values
    if (NewMemoryUtilization == -1) then
        offloading is not triggered
    else  // make the partitioning decision
        merge all non-offloadable classes into a node N
        while (size(EG) > 1)
            merge(N, one of its neighbors NBj)
            if (current cut is better) bestPos = NBj
        Partition_mobiledevice = {N0, ..., N_bestPos}
        Partition_surrogate = {N_bestPos+1, ..., Nn}
15
Performance evaluation
  • Using extensive trace-driven simulations
  • App. execution traces
  • Executed on a Linux desktop machine
  • By querying an instrumented JVM
  • Trace file records
  • Method invocations
  • Data field accesses
  • Object creations and deletions
  • Wireless network traces
  • The Ping system utility on an IBM Thinkpad
  • IEEE 802.11 WaveLAN network card

16
Simulator
  • Only consider the average RTT for small packets
    (about 2.4 ms on average)
  • Remote function invocation overhead: RTT/2
  • Remote data access overhead: RTT
  • Migration overhead: (Σ memory of classes to be
    migrated) / (current available bandwidth)
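The simulator's three overhead terms can be sketched directly. The method and parameter names, and the units (milliseconds, KB), are assumptions; the cost formulas come from the slide.

```java
// Sketch of the simulator's cost model from this slide: remote invocation costs
// RTT/2, remote data access costs RTT, and migration costs
// (total memory to move) / (available bandwidth).
class OverheadModel {
    static double remoteInvocationMs(double rttMs) {
        return rttMs / 2.0;
    }
    static double remoteDataAccessMs(double rttMs) {
        return rttMs;
    }
    static double migrationMs(double totalKB, double bandwidthKBperMs) {
        return totalKB / bandwidthKBperMs;
    }
}
```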
17
Compared approaches
  • Random and LRU
  • Both use one simple fixed policy
  • Trigger offloading when availMemory < 5% of
    totalMemory; set newMemoryUtilization < 80% of
    totalMemory
  • Random algorithm keeps randomly selected classes.
  • LRU algorithm offloads least recently used
    classes according to the AccessFreq of each class
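A sketch of the LRU baseline's selection step described above, assuming each class is represented as an {AccessFreq, memory-in-KB} pair; the encoding is an illustrative assumption.

```java
import java.util.Arrays;
import java.util.Comparator;

// Sketch of the LRU baseline: offload the least recently used classes
// (lowest AccessFreq, used here as the recency proxy named on the slide)
// until at least the required amount of memory has been freed.
class LruBaseline {
    // classes: array of {accessFreq, memoryKB}; returns total KB freed.
    static int offload(int[][] classes, int needKB) {
        Arrays.sort(classes, Comparator.comparingInt(c -> c[0]));
        int freed = 0;
        for (int[] c : classes) {
            if (freed >= needKB) break;
            freed += c[1]; // offload this class, reclaiming its memory
        }
        return freed;
    }
}
```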

18
Experimental applications
19
Results
20
Conclusion
  • Conclusion
  • OLIE relieves memory constraints for mobile
    devices with much lower overhead than other
    common approaches
  • Major contributions
  • Identifying two key decision-making problems
  • Applying the Fuzzy Control model to OLIE
  • Proposing three policies for selecting
    application partitions

21
Questions?