Llama - Who's Using It and Any Good Examples?

calisro

Member
Apr 7, 2011
6
0
0
Visit site
I didn't want to compile rsync myself, so I grabbed this APK (https://code.google.com/p/rsyncdroid/downloads/list), downloaded it, and renamed it to rsyncdroid-0.4.zip. If you open it with any zip program, you can pull the rsync executable out of the APK. I put that rsync in /data/local/bin (where I install all my binaries and 'chmod +x' them; I use that location since you can't execute binaries directly from the sdcard in a shell). I then exchanged keys with my NAS, put the resulting private key in /sdcard/.ssh/, and set the permissions required for key-based authentication.
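
Roughly, the setup looks like this from a root shell. This is only a sketch of the steps described above; the paths are examples, the location of the binary inside the APK is a guess, and if the phone has no ssh-keygen you can generate the key pair on your NAS or PC instead and just copy the private key over.

Code:
# extract the rsync binary from the renamed apk (or use any zip app, as described above;
# the path inside the zip is a guess)
unzip /sdcard/Download/rsyncdroid-0.4.zip -d /sdcard/tmp
# install it somewhere executable (not the sdcard) and mark it executable
su -c "mkdir -p /data/local/bin"
su -c "cp /sdcard/tmp/res/raw/rsync /data/local/bin/rsync"
su -c "chmod 755 /data/local/bin/rsync"
# create the key pair, append id_rsa.pub to authorized_keys on the NAS,
# and leave the private key where the rsync script below expects it
ssh-keygen -t rsa -f /sdcard/.ssh/id_rsa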

Now it is just a matter of running rsync like anywhere else. I execute it as an SL4A shell script because, via the Locale plugin support, Llama can run it in the background. Remember this needs root, so you'll need to invoke rsync like this in the shell script:

rsync.sh script:
#!/system/bin/sh
/system/xbin/su -c "/data/local/bin/rsync -e \"ssh -i /sdcard/.ssh/id_rsa\" -thlvr --delete /sdcard/ root@nas:/some/path/sdcard/"

I run CyanogenMod, so it has the right ssh in /system for this to work. I'm sure you can do the same with busybox, but I can't test that.

My Llama event is simple: when the variable night = yes and the time is between 2am and 6am, run the Locale plugin SL4A: rsync.sh.
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
To anyone having issues with location awareness, these work very well for me. They use WiFi polling when I'm close to home or at home (just in case I get a temporary disconnect) and turn it off when I'm away. You can import these and take a look if you want. You'll notice I use variables extensively. It is much simpler to keep variables and use other events to trigger off them. I don't set anything in the 'location' events except the variables; my profiles are set in other events that trigger from the variables. I can then reuse those variables for other things (like rsync, for example) without rewriting the event.

These URLs were made using the 'Llama share' functionality, which lets you import the events into your own Llama just by clicking them. :)

Location Events:
http://llama.location.profiles/1+-+At+home/1+%5Cd+At+home%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Ce%7CHome+wifi%7Cqe%7C1a+%5Cbd+Wifi+Polling+on+home%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp5%5Cp%7C500000%7Cvs%7Cloc%7Chome%7Cw%7C1-0%7C/2+-+Near+home/2+%5Cd+Near+home%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Ce%7CHome+Area%7Cni%7CHome+wifi%7Cvs%7Cloc%7Caway%7Cw%7C1-5%7Cqe%7C2a+%5Cbd+wifi+polling+on+near+home%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp5%5Cp%7C500000%7C/3+-+At+work/3+%5Cd+At+work%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Ce%7CWork+wifi%7Cvs%7Cloc%7Cwork%7Cw%7C1-0%7Cqe%7C3a+%5Cbd+wifi+polling+on+work%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp5%5Cp%7C3000000%7C/4+-+Near+work/4+%5Cd+Near+work%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Ce%7CWork+Area%7Cni%7CWork+wifi%7Cvs%7Cloc%7Caway%7Cw%7C2-0%7Cqe%7C4a+%5Cbd+wifi+polling+on+near+work%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp5%5Cp%7C500000%7C/5+-+Near+other+known+areas/5+%5Cd+Near+other+known+areas%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Ce%7CMom+and+Dads%5CpPrice%5CpRalph%5CpSusan%7Cni%7CHome+Area%5CpHome+wifi%5CpWork+Area%5CpWork+wifi%7Cvs%7Cloc%7Caway%7Cw%7C1-5%7Cqe%7C5a+%5Cbd+wifi+polling+on+others%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp5%5Cp%7C3000000%7C/9+-+Away+from+known+areas/9+%5Cd+Away+from+known+areas%7C0-1-0-0-1-0-0-0-1-0-Location%2FWifi+mgmt-30-%7C%3A%7Cni%7CHome+Area%5CpHome+wifi%5CpMom+and+Dads%5CpPrice%5CpRalph%5CpSusan%5CpWork+Area%5CpWork+wifi%7Cvs%7Cloc%7Caway%7Cw%7C2-0%7Cqe%7C9a+%5Cbd+wifi+polling+off%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdLocation%2FWifi+mgmt%5Cd0%5Cd%5Cp%3A%5Cplw%5Cp2147483647%5Cp%7C500000%7C

Profile adjustment events:
http://llama.location.profiles/1+-+Home/1+%5Cd+Home%7C0-1-0-0-0-0-0-0-1-0-Profile+mgmt-0-%7C%3A%7Cvc%7C1%7Cloc%7Chome%7Cvc%7C1%7Cnight%7C%7Ct%7C450%7C1350%7Cp2%7CHome%7C0%7C/2+-+Night/2+%5Cd+Night%7C0-1-0-0-0-0-0-0-1-0-Profile+mgmt-0-%7C%3A%7Ct%7C1350%7C390%7Cvc%7C1%7Cloc%7Chome%7Cp2%7CNight%7C0%7Ci%7C40%7Csr%7C0%7Cqe%7C2a+%5Cbd+Night+turn+off%5Cp0%5Cd1%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd0%5Cd1%5Cd0%5CdProfile+mgmt%5Cd0%5Cd%5Cp%3A%5Cpt%5Cp390%5Cp1380%5Cpc%5Cp0%5Cprs%5CpHome%5CpFgAAAGEAbgBkAHIAbwBpAGQALgBjAG8AbgB0AGUAbgB0AC4ASQBuAHQAZQBuAHQAAAAAABsAAABjAG8AbQAuAGsAZQBiAGEAYgAuAEwAbABhAG0AYQAuAFIAdQBuAFMAaABvAHIAdABjAHUAdAAAAAAAAAD%2F%2F%2F%2F%2FAACAQP%2F%2F%2F%2F8PAAAAYwBvAG0ALgBrAGUAYgBhAGIALgBMAGwAYQBtAGEAAAAuAAAAYwBvAG0ALgBrAGUAYgBhAGIALgBMAGwAYQBtAGEALgBMAGEAdQBuAGMAaABlAHIAUwBoAG8AcgB0AGMAdQB0AFIAdQBuAG4AZQByAEEAYwB0AGkAdgBpAHQAeQAAAAAAAAAAAAAAAAAAAAAAAAAAAGQAAABCTkRMAgAAAAAAAAAJAAAATABsAGEAbQBhAEQAYQB0AGEAAAAAAAAABAAAAEgAbwBtAGUAAAAAAAAAAAAJAAAATABsAGEAbQBhAFQAeQBwAGUAAAAAAAAABQAAAEUAdgBlAG4AdAAAAA%3D%3D%5Cpsr%5Cp%5Cd1%5Cpi%5Cp80%5Cpvs%5Cpnight%5Cp%5Cp%7C0%7Cvs%7Cnight%7Cyes%7C/3+-+Away/3+%5Cd+Away%7C0-1-0-1-1-0-0-0-1-0-Profile+mgmt-0-%7C%3A%7Cvc%7C1%7Cloc%7Caway%7Cp2%7CAway%7C0%7Cd%7C1%7C/4+-+Work/4+%5Cd+Work%7C0-1-0-0-0-0-0-0-1-0-Profile+mgmt-0-%7C%3A%7Cvc%7C1%7Cloc%7Cwork%7Cp2%7CQuiet%7C0%7C

For those of you who are having trouble using the URLs to import them directly into Llama, here is the English version:

Event name: 1 - At home
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Enter Area: Home wifi

Actions:
Queue another event >
Event name: 1a - Wifi Polling on home
Queue delay seconds: 5

Actions:
Llama WiFi Polling: 5 minutes

Llama variable >
Variable name: loc
Variable value : home
Toggle WiFi: WiFi On




Event name: 2 - Near home
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Enter Area: Home Area
Not in Area: Home wifi

Actions:
Llama variable >
Variable name: loc
Variable value : away
Toggle WiFi: WiFi On for at least 5 minutes
Queue another event >
Event name: 2a - wifi polling on near home
Queue delay seconds: 5

Actions:
Llama WiFi Polling: 5 minutes



Event name: 3 - At work
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Enter Area: Work wifi

Actions:
Llama variable >
Variable name: loc
Variable value : work
Toggle WiFi: WiFi On
Queue another event >
Event name: 3a - wifi polling on work
Queue delay seconds: 30

Actions:
Llama WiFi Polling: 5 minutes

Event name: 4 - Near work
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Enter Area: Work Area
Not in Area: Work wifi

Actions:
Llama variable >
Variable name: loc
Variable value : away
Toggle WiFi: WiFi Off(if not connected)
Queue another event >
Event name: 4a - wifi polling on near work
Queue delay seconds: 5

Actions:
Llama WiFi Polling: 5 minutes



Event name: 5 - Near other known areas
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Enter Area: Mom and Dads, Price|Ralph|Susan
Not in Area: Home Area, Home wifi|Work Area|Work wifi

Actions:
Llama variable >
Variable name: loc
Variable value : away
Toggle WiFi: WiFi On for at least 5 minutes
Queue another event >
Event name: 5a - wifi polling on others
Queue delay seconds: 30

Actions:
Llama WiFi Polling: 5 minutes



Event name: 9 - Away from known areas
Event group (optional): Location/Wifi mgmt

Advanced...:
Delay event
Delay seconds: 30 seconds

Conditions (match all):
Not in Area: Home Area, Home wifi|Mom and Dads|Price|Ralph|Susan|Work Area|Work wifi

Actions:
Llama variable >
Variable name: loc
Variable value : away
Toggle WiFi: WiFi Off(if not connected)
Queue another event >
Event name: 9a - wifi polling off
Queue delay seconds: 5

Actions:
Llama WiFi Polling: Never




Event name: 1 - Home
Event group (optional): Profile mgmt

Conditions (match all):
Llama variable >
Variable name: loc
Current value
Variable value : home
Llama variable >
Variable name: night
Current value
Variable value
Time Between >
From: 07:30
To: 22:30

Actions:
Profile: Home



- - - - -
Event name: 2 - Night
Event group (optional): Profile mgmt

Conditions (match all):
Time Between >
From: 22:30
To: 06:30
Llama variable >
Variable name: loc
Current value
Variable value : home

Actions:
Profile: Night
Screen Brightness: 40%
Screen rotation: No rotation
Queue another event >
Event name: 2a - Night turn off

Conditions :
Time Between >
From: 06:30
To: 23:00
Charging Status: Using battery

Actions:
Run App Shortcut: Home
Screen rotation: Rotation on
Screen Brightness: 80%
Llama variable >
Variable name: night
Variable value

Llama variable >
Variable name: night
Variable value : yes



- - - - -
Event name: 3 - Away
Event group (optional): Profile mgmt

Advanced...:
Delay event
Delay minutes: 1 minutes

Conditions (match all):
Llama variable >
Variable name: loc
Current value
Variable value : away

Actions:
Profile: Away
Mobile Data: Mobile Data On


- - - - -
Event name: 4 - Work
Event group (optional): Profile mgmt

Conditions (match all):
Llama variable >
Variable name: loc
Current value
Variable value : work

Actions:
Profile: Quiet

Events:
When 'loc' has a value of 'work' - change profile to Quiet
 
Last edited:

calisro

Member
Apr 7, 2011
6
0
0
Visit site
DO NOT forget to turn on WiFi polling in Llama's settings for the events above to work. Don't worry about the interval; the events will take care of that and toggle it between 'never' and 5 minutes. These events need it enabled.

reference for 'sharing' events:
KebabApps: Ooo, a social Llama
 
Last edited:

MathP

New member
Jul 16, 2013
0
0
0
Visit site
Hello Calisro,

thanks a lot for sharing the rsync & scripting method: I may try to set up some rsync with my NAS.

Regarding WiFi polling, I can't manage to import your links directly into Llama:
- is it because there are several events in each one?
- in the end, by using the human-readable description tool on a Llama event URL created via the share menu, I managed to decode them (but again, not to import them directly into Llama).

Did you also configure Llama to do Juice Defender's job (like kurokirasama)? I'm currently studying it.

Regards, Math
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
Hmm, Llama won't launch for me either when I try to import my own events from the URL. I even tried from Gmail on my phone. I've edited the post above to include the 'English' versions of the encoded URLs for others having issues importing directly into Llama. :)

I've not done juice defender type things.
 
Last edited:

npaladin-2000

Well-known member
Mar 3, 2010
1,175
11
0
Visit site
Hey guys, I skimmed the thread and nothing jumped out at me regarding this, so I figured I'd ask. I've got an LG Optimus G Pro, which I love to death, but my car charger can't keep up with, just because of that (frigging lovely) screen. So what I'm thinking is to figure out a way to throttle the CPU when certain applications (WAZE, Google Maps/Navigation, Car Dashboard, etc) are running, maybe give the charger half a chance to keep up. I was running Waze on the way home today and I actually LOST battery on the way (something like 4%). Obviously disabling GPS and data are not options here, I need to save power other ways if there's any way to do it.

So anyone figure out how to use Llama to throttle the CPU?
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
Hey guys, I skimmed the thread and nothing jumped out at me regarding this, so I figured I'd ask. I've got an LG Optimus G Pro, which I love to death, but my car charger can't keep up with, just because of that (frigging lovely) screen. So what I'm thinking is to figure out a way to throttle the CPU when certain applications (WAZE, Google Maps/Navigation, Car Dashboard, etc) are running, maybe give the charger half a chance to keep up. I was running Waze on the way home today and I actually LOST battery on the way (something like 4%). Obviously disabling GPS and data are not options here, I need to save power other ways if there's any way to do it.

So anyone figure out how to use Llama to throttle the CPU?

You can do it with Llama and SL4A. If you install SL4A, you can run shell scripts as a Locale plugin, and that script can run a root command to change the CPU frequency step. You would create, say, two shell scripts: one for your "slow" CPU speed and one for your normal speed. Then you would call each one depending on the condition.

Your script would look something like this:

su -c 'echo 700000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq'

That would set the max to 700MHz, for example, assuming your kernel supports it. ;)
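
For completeness, here is a minimal sketch of the two scripts, assuming a kernel that exposes the standard cpufreq sysfs interface. The frequency values are placeholders only; check /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies for what your device actually supports.

Code:
#!/system/bin/sh
# cpu-slow.sh -- cap the CPU while navigation apps are running (value is an example)
su -c 'echo 700000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq'

#!/system/bin/sh
# cpu-normal.sh -- restore the full maximum (use your device's real top frequency)
su -c 'echo 1890000 > /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq'

Each one goes into SL4A as a separate script, and Llama runs whichever applies via a Locale plugin action, the same way rsync.sh is launched earlier in the thread.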
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
Hey guys, I skimmed the thread and nothing jumped out at me regarding this, so I figured I'd ask. I've got an LG Optimus G Pro, which I love to death, but my car charger can't keep up with, just because of that (frigging lovely) screen. So what I'm thinking is to figure out a way to throttle the CPU when certain applications (WAZE, Google Maps/Navigation, Car Dashboard, etc) are running, maybe give the charger half a chance to keep up. I was running Waze on the way home today and I actually LOST battery on the way (something like 4%). Obviously disabling GPS and data are not options here, I need to save power other ways if there's any way to do it.



So anyone figure out how to use Llama to throttle the CPU?

BTW, they sell better car chargers, ones that will produce 1000mA. I bet the reason your car charger can't keep up is that it's crap and only produces a "USB" rate of 300-400mA. My home charger produces 1400mA and my car charger 1100mA, and both can keep up. Most manufacturers don't disclose the output mA; if they don't, it's crap. Don't buy it.
 

npaladin-2000

Well-known member
Mar 3, 2010
1,175
11
0
Visit site
BTW, they sell better car chargers, ones that will produce 1000mA. I bet the reason your car charger can't keep up is that it's crap and only produces a "USB" rate of 300-400mA. My home charger produces 1400mA and my car charger 1100mA, and both can keep up. Most manufacturers don't disclose the output mA; if they don't, it's crap. Don't buy it.

Actually this is one of the 1+ amp ones. Might need a 2 amp.

Sent from my LG Optimus G Pro using Tapatalk 2.
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
(rooted devices only!) Training my Llama to learn shell, Python, and intents...

Requirements: Llama, the SL4A app, and the Python interpreter for SL4A

I thought I'd share some other interesting (geeky) things I do that involve Llama and SL4A. This will be over many people's heads, but some may find it opens doors. The ability to send variables to Llama directly opens the door to extending Llama far beyond its current state.

In talking to the developer, he gave me a Llama intent that can be broadcast to set a Llama variable from outside the Llama app (from the root command line, for example). This is entirely unsupported and undocumented, by the way. It means I can now use OS commands to check phone conditions and report the result back to Llama for action. One example I use is this:

When I connect to my work wifi, I want to ensure that my OpenVPN connection is always ON while I'm connected. So I wrote a Llama event to do just that:
Code:
When ( phone is connected to "work wifi" AND llama variable LOC = work AND screen turns on )  
    Run a shortcut OPENVPN.

This worked great, but I found that it was annoyingly reconnecting every time my screen turned on. If I removed the screen-on condition, it would only connect when I initially joined my work wifi, and if the VPN dropped during the day (which happens a lot), it stayed down. That sucked.

So I changed my action to:
Code:
When ( phone is connected to "work wifi" AND llama variable LOC = work AND screen turns on )
    Run a locale plugin: SL4A: chk-vpn.py
    QUEUE an event named "VPN Fire"
         VPN Fire: (delay 7 seconds) Condition: When llama variable 'VPNConnected' has a value of 'False'
                   Run a shortcut OPENVPN.
                   Set llama variable VPNConnected to 'True'

In the SL4A app, I have the script chk-vpn.py (a Python script) check whether my home network is visible (which tells me whether the VPN is up) and set a Llama variable accordingly:

script: chk-vpn.py
Code:
import socket
import subprocess
#function to do a port check on a particular address.  very lightweight.
def chk_server(address,port):
    s = socket.socket()
    s.settimeout(4)
    try:
        s.connect((address,port))
        return True
    except socket.error:
        return False
#modify to any open port on your router and your routers internal address
check = chk_server("192.168.1.1", 80)
script = "/sdcard/sl4a/scripts/llamavar.sh"
var = "VPNConnected"
val = str(check)
subprocess.call(["/system/bin/sh" , script, var , val ])

shell script: "/sdcard/sl4a/scripts/llamavar.sh" (referenced by above program)
Code:
#!/system/bin/sh
/system/xbin/su -c "am broadcast -a com.kebab.Llama.SetLlamaVariable --es VariableName $1 --es VariableValue $2 com.kebab.Llama/com.kebab.Llama.ExportedReceiver"

Now my two scripts check whether the VPN is in fact connected and let Llama know. :)
 
Last edited:

calisro

Member
Apr 7, 2011
6
0
0
Visit site
Actually this is one of the 1+ amp ones. Might need a 2 amp.
I'm not doubting you. But just because it says 1+ amps doesn't mean it is delivering 1+ amps PER PORT. Lots of these multi-USB chargers claim 1+ amps when they have multiple ports but only deliver a nominal (less than) 500mA per port. This is totally off topic for this thread, and my apologies to everyone. Anyway, best of luck, but yes, try having Llama change your CPU speed, or perhaps set your governor to 'conservative' (which can be done the same way as above).
 

npaladin-2000

Well-known member
Mar 3, 2010
1,175
11
0
Visit site
I'm not doubting you. But just because it says 1+ amps doesn't mean it is delivering 1+ amps PER PORT. Lots of these multi-USB chargers claim 1+ amps when they have multiple ports but only deliver a nominal (less than) 500mA per port. This is totally off topic for this thread, and my apologies to everyone. Anyway, best of luck, but yes, try having Llama change your CPU speed, or perhaps set your governor to 'conservative' (which can be done the same way as above).

It's one port. ;) Still, it's probably not delivering any more than 1.1 amps, and that probably isn't cutting it. Sorry for hijacking the thread. I should be able to find a 2 amp one somewhere. Of course, if that doesn't cut it I'll still need to do some CPU throttling. :)
 

nickmax1

New member
Mar 19, 2013
0
0
0
Visit site
Everyone, upgrade to the new Llama just released on Google Play - it has some new things in there.

I've had a quick look and some new conditions have been added, which is cool.
 

dondove

New member
Jun 13, 2011
0
0
0
Visit site
When Bluetooth(in my truck) connects > launch "Slacker"
When Bluetooth disconnects > kill with root "Slacker"

This used to work great but after a couple of recent Slacker updates, kill with root Slacker is buggy.
 

calisro

Member
Apr 7, 2011
6
0
0
Visit site
When Bluetooth(in my truck) connects > launch "Slacker"
When Bluetooth disconnects > kill with root "Slacker"

This used to work great but after a couple of recent Slacker updates, kill with root Slacker is buggy.
Slacker uses a background service that may restart itself when you kill its process. Is that what you are seeing? You may be able to send Slacker an intent to interact with it, such as stopping playback, rather than killing it.
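
I haven't checked which intents Slacker actually exposes, so as a generic fallback you could have Llama run an SL4A shell script (via the Locale plugin, like the other examples in this thread) that fakes a media "stop" key press, which most players honour:

Code:
#!/system/bin/sh
# send KEYCODE_MEDIA_STOP (86); the su wrapper is needed to inject keys into other apps
/system/xbin/su -c "input keyevent 86"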
 

Rodrigo Batista

New member
Jul 4, 2013
0
0
0
Visit site
Is anybody having issues with Llama on JellyBean 4.3? I have the latest Llama from the play store (1.2013.08.10.2211) running on a stock 4.3 phone and am getting a notification saying "Llama could not get root access in time". All actions are performed properly, but I would like to get rid of the notification.

By the way, I am running SuperSU 1.51 and Llama has been granted su access rights.

Rodrigo
 

Olivier D

New member
Mar 29, 2013
3
0
0
Visit site
Is anybody having issues with Llama on JellyBean 4.3? I have the latest Llama from the play store (1.2013.08.10.2211) running on a stock 4.3 phone and am getting a notification saying "Llama could not get root access in time". All actions are performed properly, but I would like to get rid of the notification.

By the way, I am running SuperSU 1.51 and Llama has been granted su access rights.

Rodrigo

I would help you, but I'm currently running CyanogenMod and their latest stable release is 4.2-based.