[HOWTO] Build /e/ without Docker for non-LineageOS-supported devices

Yes, sorry, I missed it. In an earlier post he wrote that he will use pie dipper for testing his build environment.

Hello

To clarify: I get the same error whether I build for s2 oreo or dipper pie. I get this error on Google Cloud Compute and Vultr, both with an Ubuntu VM and 30-32 GB of RAM.

I’ve attempted a vanilla run of build.sh, unmodified, for s2 oreo. Pastebin and screenshot are in my previous post.

My logs are empty. I can see the log files for s2 and dipper, but they contain no content.

Could you please try this command manually in a terminal:

git clone https://gitlab.e.foundation/e/os/docker-lineage-cicd.git

Cloning into 'docker-lineage-cicd'...
remote: Enumerating objects: 708, done.
remote: Counting objects: 100% (708/708), done.
remote: Compressing objects: 100% (347/347), done.
remote: Total 708 (delta 426), reused 624 (delta 360)
Receiving objects: 100% (708/708), 704.05 KiB | 11.00 MiB/s, done.
Resolving deltas: 100% (426/426), done.

Looks good. Now start the script again (as root).

Cleaned out the /srv folder and recreated build.sh from the post with no changes, then chmod 755 the script.

Cloned https://gitlab.e.foundation/e/os/docker-lineage-cicd.git

Then ran ./build.sh as root.
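
For reference, that amounts to roughly this sequence (exact paths are my assumption, based on the directory listing further down):

cd /root
rm -rf /srv/*
chmod 755 build.sh
git clone https://gitlab.e.foundation/e/os/docker-lineage-cicd.git
./build.sh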

Cloning into '/srv/tmp/buildscripts'...
remote: Enumerating objects: 708, done.
remote: Counting objects: 100% (708/708), done.
remote: Compressing objects: 100% (347/347), done.
remote: Total 708 (delta 426), reused 624 (delta 360)
Receiving objects: 100% (708/708), 704.05 KiB | 11.54 MiB/s, done.
Resolving deltas: 100% (426/426), done.
mkdir: cannot create directory ‘delta’: File exists
Cloning into 'OpenDelta'...
remote: Enumerating objects: 226, done.
remote: Counting objects: 100% (226/226), done.
remote: Compressing objects: 100% (166/166), done.
remote: Total 226 (delta 30), reused 186 (delta 13), pack-reused 0
Receiving objects: 100% (226/226), 644.03 KiB | 3.93 MiB/s, done.
Resolving deltas: 100% (30/30), done.
Set cache size limit to 50.0 GB
>> [Wed Dec  4 20:14:10 UTC 2019] Branch:  v1-oreo
>> [Wed Dec  4 20:14:10 UTC 2019] Devices: s2,
>> [Wed Dec  4 20:14:10 UTC 2019] (Re)initializing branch repository
>> [Wed Dec  4 20:14:13 UTC 2019] Copying '/srv/local_manifests/*.xml' to '.repo/local_manifests/'
>> [Wed Dec  4 20:14:13 UTC 2019] Use branch lineage-15.1 on github.com/TheMuppets
>> [Wed Dec  4 20:14:13 UTC 2019] Syncing branch repository
>> [Wed Dec  4 20:42:36 UTC 2019] Applying the restricted signature spoofing patch (based on android_frameworks_base-O.patch) to frameworks/base
>> [Wed Dec  4 20:42:37 UTC 2019] Setting "UNOFFICIAL" as release type
>> [Wed Dec  4 20:42:37 UTC 2019] Adding OTA URL overlay (for custom URL )
>> [Wed Dec  4 20:42:37 UTC 2019] Adding custom packages (MuPDF GmsCore GsfProxy FakeStore com.google.android.maps.jar Telegram Signal Mail BlissLauncher BlissIconPack MozillaNlpBackend OpenWeatherMapWeatherProvider AccountManager MagicEarth OpenCamera eDrive Weather Notes Tasks NominatimNlpBackend Light DroidGuard OpenKeychain Message Browser BrowserWebView Apps LibreOfficeViewer)
>> [Wed Dec  4 20:42:37 UTC 2019] Using OpenJDK 8
>> [Wed Dec  4 20:42:37 UTC 2019] Preparing build environment
>> [Wed Dec  4 20:42:38 UTC 2019] Starting build for s2, v1-oreo branch
ANDROID_JACK_VM_ARGS=-Dfile.encoding=UTF-8 -XX:+TieredCompilation -Xmx4G
>> [Wed Dec  4 20:46:08 UTC 2019] Failed build for s2
>> [Wed Dec  4 20:46:08 UTC 2019] Finishing build for s2
>> [Wed Dec  4 20:46:08 UTC 2019] Cleaning source dir for device s2

Have you tried without the ‘Jack’ command?

And is Jack installed on your PC?
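
One quick way to check is to look for the directories the Jack server normally creates in the build user’s home (locations assumed to be the Jack defaults):

ls -la ~/.jack-settings ~/.jack-server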

Can you post an “ls -alp” from your /root/ folder?

drwx------ 12 root root  4096 Dec  4 20:57 ./
drwxr-xr-x 25 root root  4096 Dec  3 22:31 ../
-rw-------  1 root root  4421 Dec  4 16:11 .bash_history
-rw-r--r--  1 root root  3107 Dec  4 14:45 .bashrc
drwxr-xr-x  2 root root  4096 Dec  3 22:31 bin/
-rwxr-xr-x  1 root root 18148 Dec  4 20:57 build.sh
drwx------  3 root root  4096 Dec  3 23:18 .cache/
-rwxr-xr-x  1 root root  3951 Dec  4 20:57 clean_up.py
drwxr-xr-x  2 root root  4096 Dec  4 20:57 delta/
drwxr-xr-x  4 root root  4096 Dec  4 19:46 docker-lineage-cicd/
-rwxr-xr-x  1 root root    49 Dec  4 20:57 fix_date.sh
-rw-r--r--  1 root root    44 Dec  4 20:57 .gitconfig
drwx------  3 root root  4096 Nov  8 12:24 .gnupg/
-rwxr-xr-x  1 root root  2996 Dec  4 20:57 init.sh
drwxr-xr-x  3 root root  4096 Dec  3 22:28 .local/
-rwxr-xr-x  1 root root  2266 Dec  4 20:57 make_key
-rwxr-xr-x  1 root root  1957 Dec  4 20:57 opendelta_builds_json.py
-rw-r--r--  1 root root   218 Dec  4 20:57 packages_updater_strings.xml
-rw-r--r--  1 root root   148 Aug 17  2015 .profile
drwxr-xr-x  3 root root  4096 Dec  3 22:39 .repoconfig/
-rw-r--r--  1 root root    80 Dec  4 20:57 .repo_.gitconfig.json
drwxr-xr-x  2 root root  4096 Dec  3 22:31 signature_spoofing_patches/
drwxr-xr-x  2 root root  4096 Dec  3 22:31 userscripts/
-rw-r--r--  1 root root   180 Dec  4 20:57 .wget-hsts
drwxr-xr-x  2 root root  4096 Nov  8 12:24 xhprof/

I set up another VM on Vultr: Ubuntu 18.04, a fresh install with just an apt update and upgrade, then ran the script as-is and got the same results again.
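
For reference, the only preparation on that fresh VM was the usual update/upgrade as root, i.e. something like:

apt update && apt upgrade -y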

Could you please try this https://del.dog/sailfish script? It was used successfully by me and two other users for two weeks.
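
Just save the paste locally as a shell script, make it executable and run it as root, e.g. (the file name is only an example):

chmod +x sailfish.sh
./sailfish.sh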

>> [Thu Dec  5 11:13:12 UTC 2019] Using OpenJDK 8
>> [Thu Dec  5 11:13:12 UTC 2019] Preparing build environment
>> [Thu Dec  5 11:13:13 UTC 2019] Starting build for sailfish, v1-pie branch
ANDROID_JACK_VM_ARGS=-Dfile.encoding=UTF-8 -XX:+TieredCompilation -Xmx4G
>> [Thu Dec  5 11:21:02 UTC 2019] Failed build for sailfish
>> [Thu Dec  5 11:21:02 UTC 2019] Finishing build for sailfish

Weird, it failed again. Must be the VM, I guess.

OK, sorry, but I’m at the end of my knowledge. Sorry :pensive:

I have just run the script again and the build is running as it should:
TARGET_CPU_VARIANT=kryo
TARGET_2ND_ARCH=arm
TARGET_2ND_ARCH_VARIANT=armv8-a
TARGET_2ND_CPU_VARIANT=kryo
HOST_ARCH=x86_64
HOST_2ND_ARCH=x86
HOST_OS=linux
HOST_OS_EXTRA=Linux-4.15.0-72-generic-x86_64-Linux-Mint-19.1
HOST_CROSS_OS=windows
HOST_CROSS_ARCH=x86
HOST_CROSS_2ND_ARCH=x86_64
HOST_BUILD_TYPE=release
BUILD_ID=PQ3A.190801.002
OUT_DIR=/srv/e/src/PIE/out
WITH_SU=false

[1/1] /srv/e/src/PIE/out/soong/.minibootstrap/minibp /srv/e/src/PIE/out/soong/.bootstrap/build.ninja
[1/56] compile /srv/e/src/PIE/out/soong/.bootstrap/blueprint-deptools/pkg/github.com/google/blueprint/deptools.a
[2/56] compile /srv/e/src/PIE/out/soong/.bootstrap/gotestrunner/obj/gotestrunner.a
[3/56] compile /srv/e/src/PIE/out/soong/.bootstrap/gotestmain/obj/gotestmain.a
[4/56] compile /srv/e/src/PIE/out/soong/.bootstrap/blueprint-pathtools/pkg/github.com/google/blueprint/pathtools.a
[5/56] compile /srv/e/src/PIE/out/soong/.bootstrap/blueprint-pathtools/test/github.com/google/blueprint/pathtools.a
[6/56] compile /srv/e/src/PIE/out/soong/.bootstrap/bpglob/obj/bpglob.a
[7/56] link /srv/e/src/PIE/out/soong/.bootstrap/gotestrunner/obj/a.out
[8/56] cp /srv/e/src/PIE/out/soong/.bootstrap/bin/gotestrunner
[9/56] link /srv/e/src/PIE/out/soong/.bootstrap/bpglob/obj/a.out
[10/56] link /srv/e/src/PIE/out/soong/.bootstrap/gotestmain/obj/a.out
[11/56] cp /srv/e/src/PIE/out/soong/.bootstrap/bin/gotestmain
[12/56] gotestmain /srv/e/src/PIE/out/soong/.bootstrap/blueprint-pathtools/test/test.go
[13/56] compile /srv/e/src/PIE/out/soong/.bootstrap/blueprint-pathtools/test/test.a
[14/56] link /srv/e/src/PIE/out/soong/.bootstrap/bluepri

That seems normal. Without logs it will be difficult to help; are you sure the log file /srv/logs/s2 is empty?
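
One quick way to double-check is to list everything under the log directory with sizes, e.g.:

ls -laR /srv/logs/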

I managed to get a complete log after the sailfish script ran. Is this because unzip is not installed? I’ve installed it now and run the sailfish script again; so far it’s up to the ANDROID_JACK_VM_ARGS line again with no failure.

https://del.dog/edinyrfele

SOLVED!

The unzip package was missing on my system.
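
For anyone else hitting this, the fix on Ubuntu is simply:

sudo apt install unzip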

Thanks for your help and patience, guys.

@eggzenbeanz did you get it to build for Clover???

Not yet. I’ve successfully built sailfish as per @harvey186’s script. I’m now building dipper using the same script and checking whether that works, to then flash it on my Mi 8. Once that is complete I’ll tackle clover.

OK, so now on to some build questions.

I have found my device, kernel & vendor repos on GitHub and forked them into my own account.

I have some questions around the git clone step:

Where are these repos being cloned into? /srv/local_manifests?

So I would end up with a structure like /srv/local_manifests/device/xiaomi/clover?

If not, where do I clone the repos to?

I’m using this git clone method to get the lineage-16.0 branch. Is that correct?
git clone -b lineage-16.0 --all https://github.com/vanhoopstallion/android_device_xiaomi_clover.git

My roomservice.xml then looks like this

<?xml version="1.0" encoding="UTF-8"?>
<manifest>
  <project name="vanhoopstallion/android_kernel_xiaomi_clover" path="kernel/xiaomi/clover" remote="github" />
  <project name="vanhoopstallion/android_device_xiaomi_clover" path="device/xiaomi/clover" remote="github" />
  <project name="vanhoopstallion/proprietary_vendor_xiaomi_clover" path="vendor/xiaomi/clover" remote="github" />
  <project name="LineageOS/android_packages_resources_devicesettings" path="packages/resources/devicesettings" remote="github" />
</manifest>

Where should the path be pointing to? Is this the directory I cloned the repos into? Does the path have to be absolute?

I think you can just go to your /srv/e/src/PIE folder, open a terminal and run commands in this format:

sudo git clone https://github.com/MyCats/android_device_xiaomi_whyred device/xiaomi/whyred -b lineage-16.0

sudo git clone https://github.com/MyCats/android_kernel_xiaomi_whyred kernel/xiaomi/whyred -b lineage-16.0

sudo git clone https://github.com/MyCats/android_vendor_xiaomi_whyred vendor/xiaomi/whyred -b lineage-16.0

sudo git clone https://github.com/LineageOS/android_packages_resources_devicesettings packages/resources/devicesettings -b lineage-16.0
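
If the clones land in the right place, the trees should show up directly under the source tree, e.g. (whyred here only because it is the example device used in the commands above):

ls /srv/e/src/PIE/device/xiaomi/whyred /srv/e/src/PIE/kernel/xiaomi/whyred /srv/e/src/PIE/vendor/xiaomi/whyred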