fixes

This commit is contained in:
parent cb33e40562
commit 935fc42a87

misc/REFACTORING_SUMMARY.md (new file, 208 lines)
@@ -0,0 +1,208 @@
# Build.func Refactoring Summary - CORRECTED

**Date:** 2025-10-29
**Backup:** build.func.backup-refactoring-*

## Changes Made (CORRECTED)

### 1. GPU Passthrough Simplification ✅

**Problem:** NVIDIA support was overcomplicated, with driver checks, nvidia-smi calls, and automatic installations.

**Solution (CORRECTED):**
- ✅ Removed: `check_nvidia_host_setup()` function (unnecessary nvidia-smi checks)
- ✅ Removed: VAAPI/NVIDIA verification checks after container start
- ✅ **KEPT:** `lxc.mount.entry` for all GPU types (Intel/AMD/NVIDIA) ✅✅✅
- ✅ **KEPT:** `lxc.cgroup2.devices.allow` for privileged containers
- ✅ Simplified: no more driver detection, only device binding
- ✅ The user installs the drivers inside the container themselves

**GPU config now:**
```lxc
# Intel/AMD:
lxc.mount.entry: /dev/dri/renderD128 /dev/dri/renderD128 none bind,optional,create=file
lxc.mount.entry: /dev/dri/card0 /dev/dri/card0 none bind,optional,create=file
lxc.cgroup2.devices.allow: c 226:128 rwm # if privileged

# NVIDIA:
lxc.mount.entry: /dev/nvidia0 /dev/nvidia0 none bind,optional,create=file
lxc.mount.entry: /dev/nvidiactl /dev/nvidiactl none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm /dev/nvidia-uvm none bind,optional,create=file
lxc.cgroup2.devices.allow: c 195:0 rwm # if privileged
```

**Result:**
- GPU passthrough works purely via LXC mount entries
- No unnecessary host checks or nvidia-smi calls
- The user installs drivers inside the container when needed
- ~40 lines of code removed (a sketch of the simplified binding step follows below)
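The whole step now boils down to appending the relevant entries to the container config. A minimal sketch of that idea, assuming the standard Proxmox config path `/etc/pve/lxc/<CTID>.conf` and using illustrative variable names rather than the actual helpers in build.func:

```bash
#!/usr/bin/env bash
# Sketch only: bind GPU devices into an LXC container without any driver detection.
CTID="$1"       # container ID supplied by the caller
GPU_TYPE="$2"   # "intel-amd" or "nvidia" (illustrative)
CONF="/etc/pve/lxc/${CTID}.conf"

if [[ "$GPU_TYPE" == "nvidia" ]]; then
  DEVICES=(/dev/nvidia0 /dev/nvidiactl /dev/nvidia-uvm)
else
  DEVICES=(/dev/dri/renderD128 /dev/dri/card0)
fi

for dev in "${DEVICES[@]}"; do
  # Bind only devices that actually exist on the host.
  [[ -e "$dev" ]] && echo "lxc.mount.entry: $dev $dev none bind,optional,create=file" >>"$CONF"
done
```

Driver installation inside the container stays entirely in the user's hands, which is exactly what the simplification aims for.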
### 2. SSH Key Functions ✅

**Analysis:**
- `install_ssh_keys_into_ct()` - already well structured ✅
- `find_host_ssh_keys()` - already well structured ✅

**Status:** No changes needed - already cleanly implemented as functions

### 3. Default Vars Logic Reworked ✅

**Problem:** Some var_* defaults make no sense as global defaults:
- `var_ctid` - container IDs can only be assigned once ❌
- `var_ipv6_static` - static IPs can only be assigned once ❌

**Not a problem (CORRECTED):**
- `var_gateway` - can be set as a default (user's responsibility) ✅
- `var_apt_cacher` - can be set as a default + runtime check ✅
- `var_apt_cacher_ip` - can be set as a default + runtime check ✅

**Solution:**
- ✅ **REMOVED** from VAR_WHITELIST: var_ctid, var_ipv6_static
- ✅ **KEPT** in VAR_WHITELIST: var_gateway, var_apt_cacher, var_apt_cacher_ip
- ✅ **NEW:** runtime check for APT cacher reachability (curl, 2 s timeout)
- ✅ Comments added to explain the reasoning (see the whitelist sketch below)
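The exact whitelist lives in build.func; purely as an illustration of the intended shape after the change (entry list abbreviated, and the helper shown is not the project's actual implementation):

```bash
# Sketch only: whitelist of var_* keys that may be persisted as global defaults.
# var_ctid and var_ipv6_static are deliberately absent - they are unique per container.
declare -ag VAR_WHITELIST=(
  var_gateway var_apt_cacher var_apt_cacher_ip
  # ... remaining whitelisted keys ...
)

# Illustrative membership check in the spirit of _is_whitelisted_key()
is_whitelisted_key() {
  local key="$1" k
  for k in "${VAR_WHITELIST[@]}"; do
    [[ "$k" == "$key" ]] && return 0
  done
  return 1
}
```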
**APT Cacher Runtime Check:**
```bash
# Runtime check: Verify APT cacher is reachable if configured
if [[ -n "$APT_CACHER_IP" && "$APT_CACHER" == "yes" ]]; then
  if ! curl -s --connect-timeout 2 "http://${APT_CACHER_IP}:3142" >/dev/null 2>&1; then
    msg_warn "APT Cacher configured but not reachable at ${APT_CACHER_IP}:3142"
    msg_info "Disabling APT Cacher for this installation"
    APT_CACHER=""
    APT_CACHER_IP=""
  else
    msg_ok "APT Cacher verified at ${APT_CACHER_IP}:3142"
  fi
fi
```

**Result:**
- Only sensible defaults: no var_ctid, no static IPs
- The APT cacher falls back automatically when it is not reachable
- Gateway stays available as a default (user's responsibility in case of conflicts)

## Code Statistics

### Before:
- Lines: 3,518
- check_nvidia_host_setup(): 22 lines
- NVIDIA verification: 8 lines
- Var whitelist entries: 28

### After:
- Lines: 3,458
- check_nvidia_host_setup(): **REMOVED**
- NVIDIA verification: **REMOVED**
- APT cacher check: **NEW** (13 lines)
- lxc.mount.entry: **KEPT** for all GPUs ✅
- Var whitelist entries: 26 (var_ctid, var_ipv6_static removed)

### Savings:
- ~60 lines of code
- 2 problematic var_* entries removed
- Reduced complexity
- Increased robustness (APT cacher check)

## What Was CORRECTED

### Mistake 1: lxc.mount.entry removed ❌
**Problem:** I had removed the `lxc.mount.entry` lines and kept only `dev0:` entries.
**Fix:** `lxc.mount.entry` re-added for all GPU types! ✅

### Mistake 2: Too much removed from the whitelist ❌
**Problem:** gateway and apt_cacher should be allowed to stay.
**Fix:** Only var_ctid and var_ipv6_static removed! ✅

### Mistake 3: No APT cacher fallback ❌
**Problem:** The APT cacher might not be reachable.
**Fix:** Runtime check with curl --connect-timeout 2 added! ✅

## Testing Checklist

Test before deployment:

### GPU Passthrough:
- [ ] Intel iGPU: check lxc.mount.entry for /dev/dri/*
- [ ] AMD GPU: check lxc.mount.entry for /dev/dri/*
- [ ] NVIDIA GPU: check lxc.mount.entry for /dev/nvidia*
- [ ] Privileged: check lxc.cgroup2.devices.allow
- [ ] Unprivileged: check lxc.mount.entry only (no cgroup rules)
- [ ] Multi-GPU system (user selection)
- [ ] System without GPU (passthrough skipped)

### APT Cacher:
- [ ] APT cacher reachable → used
- [ ] APT cacher not reachable → disabled with a warning
- [ ] APT cacher not configured → skipped (a manual probe example follows this list)
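The reachability cases can also be probed by hand on the Proxmox host before running an installer. A minimal check, with a placeholder cacher address:

```bash
# Replace 192.168.1.10 with the configured APT cacher IP (placeholder).
if curl -s --connect-timeout 2 "http://192.168.1.10:3142" >/dev/null; then
  echo "apt-cacher reachable - installer will use it"
else
  echo "apt-cacher NOT reachable - installer should warn and fall back"
fi
```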
### Default Vars:
- [ ] var_ctid NOT in defaults
- [ ] var_ipv6_static NOT in defaults
- [ ] var_gateway in defaults ✅
- [ ] var_apt_cacher in defaults ✅

## Breaking Changes

**NO breaking changes anymore!**

### GPU Passthrough:
- ✅ lxc.mount.entry stays as it was
- ✅ Only the nvidia-smi checks were removed
- ✅ The user installs drivers themselves (as has always been the case)

### Default Vars:
- ✅ gateway remains available
- ✅ apt_cacher remains available (+ new check)
- ❌ var_ctid removed (makes no sense as a default)
- ❌ var_ipv6_static removed (makes no sense as a default)

## Benefits

### GPU Passthrough:
- ✅ Simpler code, fewer sources of error
- ✅ No host dependencies (nvidia-smi)
- ✅ lxc.mount.entry works as expected ✅
- ✅ The user stays in control of the container drivers

### Default Vars:
- ✅ APT cacher with automatic fallback
- ✅ Gateway possible as a default (user's responsibility)
- ✅ Prevents CT-ID and static-IP conflicts
- ✅ Clearer logic

## Technical Details

### GPU Device Binding (CORRECTED):

**Intel/AMD:**
```lxc
lxc.mount.entry: /dev/dri/renderD128 /dev/dri/renderD128 none bind,optional,create=file
lxc.mount.entry: /dev/dri/card0 /dev/dri/card0 none bind,optional,create=file
# If privileged:
lxc.cgroup2.devices.allow: c 226:128 rwm
lxc.cgroup2.devices.allow: c 226:0 rwm
```

**NVIDIA:**
```lxc
lxc.mount.entry: /dev/nvidia0 /dev/nvidia0 none bind,optional,create=file
lxc.mount.entry: /dev/nvidiactl /dev/nvidiactl none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm /dev/nvidia-uvm none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm-tools /dev/nvidia-uvm-tools none bind,optional,create=file
# If privileged:
lxc.cgroup2.devices.allow: c 195:0 rwm
lxc.cgroup2.devices.allow: c 195:255 rwm
```
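Whether those bindings actually land in a given container can be verified from the host. A quick check, with a placeholder container ID (device names vary per system):

```bash
# Inspect the generated config for container 105 (placeholder ID).
grep -E 'lxc\.(mount\.entry|cgroup2\.devices\.allow)' /etc/pve/lxc/105.conf

# List the bound device nodes inside the running container.
pct exec 105 -- sh -c 'ls -l /dev/dri /dev/nvidia* 2>/dev/null'
```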
### Whitelist Diff (CORRECTED):

**Removed:**
- var_ctid (makes no sense as a default - CT IDs are unique)
- var_ipv6_static (makes no sense as a default - static IPs are unique)

**Kept:**
- var_gateway (user's responsibility)
- var_apt_cacher (with runtime check)
- var_apt_cacher_ip (with runtime check)
- all other 24 entries
misc/build.func.backup-20251029-123804 (new file, 3516 lines) - diff suppressed because it is too large
misc/build.func.backup-20251029-124205 (new file, 3516 lines) - diff suppressed because it is too large
misc/build.func.backup-20251029-124307 (new file, 3517 lines) - diff suppressed because it is too large
misc/build.func.backup-20251029-124334 (new file, 3517 lines) - diff suppressed because it is too large
misc/build.func.backup-refactoring-20251029-125644 (new file, 3517 lines) - diff suppressed because it is too large

misc/optimize_build_func.py (new file, 508 lines)
@@ -0,0 +1,508 @@
#!/usr/bin/env python3
"""
Build.func Optimizer
====================
Optimizes the build.func file by:
- Removing duplicate functions
- Sorting and grouping functions logically
- Adding section headers
- Improving readability
"""

import re
import sys
from pathlib import Path
from datetime import datetime
from typing import List, Tuple, Dict

# ==============================================================================
# CONFIGURATION
# ==============================================================================

# Define function groups in desired order
FUNCTION_GROUPS = {
    "CORE_INIT": {
        "title": "CORE INITIALIZATION & VARIABLES",
        "functions": [
            "variables",
        ]
    },
    "DEPENDENCIES": {
        "title": "DEPENDENCY LOADING",
        "functions": [
            # Bootstrap loader section (commented code)
        ]
    },
    "VALIDATION": {
        "title": "SYSTEM VALIDATION & CHECKS",
        "functions": [
            "maxkeys_check",
            "check_container_resources",
            "check_container_storage",
            "check_nvidia_host_setup",
            "check_storage_support",
        ]
    },
    "NETWORK": {
        "title": "NETWORK & IP MANAGEMENT",
        "functions": [
            "get_current_ip",
            "update_motd_ip",
        ]
    },
    "SSH": {
        "title": "SSH KEY MANAGEMENT",
        "functions": [
            "find_host_ssh_keys",
            "ssh_discover_default_files",
            "ssh_extract_keys_from_file",
            "ssh_build_choices_from_files",
            "configure_ssh_settings",
            "install_ssh_keys_into_ct",
        ]
    },
    "SETTINGS": {
        "title": "SETTINGS & CONFIGURATION",
        "functions": [
            "base_settings",
            "echo_default",
            "exit_script",
            "advanced_settings",
            "diagnostics_check",
            "diagnostics_menu",
            "default_var_settings",
            "ensure_global_default_vars_file",
            "settings_menu",
            "edit_default_storage",
        ]
    },
    "DEFAULTS": {
        "title": "DEFAULTS MANAGEMENT (VAR_* FILES)",
        "functions": [
            "get_app_defaults_path",
            "_is_whitelisted_key",
            "_sanitize_value",
            "_load_vars_file",
            "_load_vars_file_to_map",
            "_build_vars_diff",
            "_build_current_app_vars_tmp",
            "maybe_offer_save_app_defaults",
            "ensure_storage_selection_for_vars_file",
        ]
    },
    "STORAGE": {
        "title": "STORAGE DISCOVERY & SELECTION",
        "functions": [
            "resolve_storage_preselect",
            "select_storage",
            "choose_and_set_storage_for_file",
            "_write_storage_to_vars",
        ]
    },
    "GPU": {
        "title": "GPU & HARDWARE PASSTHROUGH",
        "functions": [
            "is_gpu_app",
            "detect_gpu_devices",
            "configure_gpu_passthrough",
            "configure_usb_passthrough",
            "configure_additional_devices",
            "fix_gpu_gids",
            "get_container_gid",
        ]
    },
    "CONTAINER": {
        "title": "CONTAINER LIFECYCLE & CREATION",
        "functions": [
            "create_lxc_container",
            "offer_lxc_stack_upgrade_and_maybe_retry",
            "parse_template_osver",
            "pkg_ver",
            "pkg_cand",
            "ver_ge",
            "ver_gt",
            "ver_lt",
            "build_container",
            "destroy_lxc",
            "description",
        ]
    },
    "MAIN": {
        "title": "MAIN ENTRY POINTS & ERROR HANDLING",
        "functions": [
            "install_script",
            "start",
            "api_exit_script",
        ]
    },
}

# Functions to exclude from duplication check (intentionally similar)
EXCLUDE_FROM_DEDUP = {
    "_load_vars_file",
    "_load_vars_file_to_map",
}

# ==============================================================================
# HELPER FUNCTIONS
# ==============================================================================

def extract_functions(content: str) -> Dict[str, Tuple[str, int, int]]:
    """
    Extract all function definitions from the content.
    Returns dict: {function_name: (full_code, start_line, end_line)}
    """
    functions = {}
    lines = content.split('\n')

    i = 0
    while i < len(lines):
        line = lines[i]

        # Match function definition: function_name() {
        match = re.match(r'^([a-zA-Z_][a-zA-Z0-9_]*)\s*\(\)\s*\{', line)
        if match:
            func_name = match.group(1)
            start_line = i

            # Find function end by counting braces
            brace_count = 1
            func_lines = [line]
            i += 1

            while i < len(lines) and brace_count > 0:
                current_line = lines[i]
                func_lines.append(current_line)

                # Count braces (simple method, doesn't handle strings/comments perfectly)
                brace_count += current_line.count('{') - current_line.count('}')
                i += 1

            end_line = i
            functions[func_name] = ('\n'.join(func_lines), start_line, end_line)
            continue

        i += 1

    return functions


def extract_header_comments(content: str, func_name: str, func_code: str) -> str:
    """Extract comment block before function if exists"""
    lines = content.split('\n')

    # Find function start in original content
    for i, line in enumerate(lines):
        if line.strip().startswith(f"{func_name}()"):
            # Look backwards for comment block
            comments = []
            j = i - 1
            while j >= 0:
                prev_line = lines[j]
                stripped = prev_line.strip()

                # SKIP section headers and copyright - we add our own
                if (stripped.startswith('# ===') or
                        stripped.startswith('#!/usr/bin/env') or
                        'Copyright' in stripped or
                        'Author:' in stripped or
                        'License:' in stripped or
                        'Revision:' in stripped or
                        'SECTION' in stripped):
                    j -= 1
                    continue

                # Include function-specific comment lines
                if (stripped.startswith('# ---') or
                        stripped.startswith('#')):
                    comments.insert(0, prev_line)
                    j -= 1
                elif stripped == '':
                    # Keep collecting through empty lines
                    comments.insert(0, prev_line)
                    j -= 1
                else:
                    break

            # Remove leading empty lines from comments
            while comments and comments[0].strip() == '':
                comments.pop(0)

            # Remove trailing empty lines from comments
            while comments and comments[-1].strip() == '':
                comments.pop()

            if comments:
                return '\n'.join(comments) + '\n'

    return ''


def find_duplicate_functions(functions: Dict[str, Tuple[str, int, int]]) -> List[str]:
    """Find duplicate function definitions"""
    seen = {}
    duplicates = []

    for func_name, (code, start, end) in functions.items():
        if func_name in EXCLUDE_FROM_DEDUP:
            continue

        # Normalize code for comparison (remove whitespace variations)
        normalized = re.sub(r'\s+', ' ', code).strip()

        if normalized in seen:
            duplicates.append(func_name)
            print(f"  ⚠️  Duplicate found: {func_name} (also defined as {seen[normalized]})")
        else:
            seen[normalized] = func_name

    return duplicates


def create_section_header(title: str) -> str:
    """Create a formatted section header"""
    return f"""
# ==============================================================================
# {title}
# ==============================================================================
"""


def get_function_group(func_name: str) -> str:
    """Determine which group a function belongs to"""
    for group_key, group_data in FUNCTION_GROUPS.items():
        if func_name in group_data["functions"]:
            return group_key
    return "UNKNOWN"


# ==============================================================================
# MAIN OPTIMIZATION LOGIC
# ==============================================================================

def optimize_build_func(input_file: Path, output_file: Path):
    """Main optimization function"""

    print("=" * 80)
    print("BUILD.FUNC OPTIMIZER")
    print("=" * 80)
    print()

    # Read input file
    print(f"📖 Reading: {input_file}")
    content = input_file.read_text(encoding='utf-8')
    original_lines = len(content.split('\n'))
    print(f"   Lines: {original_lines:,}")
    print()

    # Extract functions
    print("🔍 Extracting functions...")
    functions = extract_functions(content)
    print(f"   Found {len(functions)} functions")
    print()

    # Find duplicates
    print("🔎 Checking for duplicates...")
    duplicates = find_duplicate_functions(functions)
    if duplicates:
        print(f"   Found {len(duplicates)} duplicate(s)")
    else:
        print("   ✓ No duplicates found")
    print()

    # Extract header (copyright, etc)
    print("📝 Extracting file header...")
    lines = content.split('\n')
    header_lines = []

    # Extract only the first copyright block
    in_header = True
    for i, line in enumerate(lines):
        if in_header:
            # Keep copyright and license lines
            if (line.strip().startswith('#!') or
                    line.strip().startswith('# Copyright') or
                    line.strip().startswith('# Author:') or
                    line.strip().startswith('# License:') or
                    line.strip().startswith('# Revision:') or
                    line.strip() == ''):
                header_lines.append(line)
            else:
                in_header = False
                break

    # Remove trailing empty lines
    while header_lines and header_lines[-1].strip() == '':
        header_lines.pop()

    header = '\n'.join(header_lines)
    print()

    # Build optimized content
    print("🔨 Building optimized structure...")

    optimized_parts = [header]

    # Group functions
    grouped_functions = {key: [] for key in FUNCTION_GROUPS.keys()}
    grouped_functions["UNKNOWN"] = []

    for func_name, (func_code, start, end) in functions.items():
        if func_name in duplicates:
            continue  # Skip duplicates

        group = get_function_group(func_name)

        # Extract comments before function
        comments = extract_header_comments(content, func_name, func_code)

        grouped_functions[group].append((func_name, comments + func_code))

    # Add grouped sections
    for group_key, group_data in FUNCTION_GROUPS.items():
        if grouped_functions[group_key]:
            optimized_parts.append(create_section_header(group_data["title"]))

            for func_name, func_code in grouped_functions[group_key]:
                optimized_parts.append(func_code)
                optimized_parts.append('')  # Empty line between functions

    # Add unknown functions at the end
    if grouped_functions["UNKNOWN"]:
        optimized_parts.append(create_section_header("UNCATEGORIZED FUNCTIONS"))
        print(f"   ⚠️  {len(grouped_functions['UNKNOWN'])} uncategorized functions:")
        for func_name, func_code in grouped_functions["UNKNOWN"]:
            print(f"      - {func_name}")
            optimized_parts.append(func_code)
            optimized_parts.append('')

    # Add any remaining non-function code (bootstrap, source commands, traps, etc)
    print("📌 Adding remaining code...")

    # Extract bootstrap/source section
    bootstrap_lines = []
    trap_lines = []
    other_lines = []

    in_function = False
    brace_count = 0
    in_bootstrap_comment = False

    for line in lines:
        stripped = line.strip()

        # Skip the header we already extracted
        if (stripped.startswith('#!/usr/bin/env bash') or
                stripped.startswith('# Copyright') or
                stripped.startswith('# Author:') or
                stripped.startswith('# License:') or
                stripped.startswith('# Revision:')):
            continue

        # Check if we're in a function
        if re.match(r'^[a-zA-Z_][a-zA-Z0-9_]*\s*\(\)\s*\{', line):
            in_function = True
            brace_count = 1
        elif in_function:
            brace_count += line.count('{') - line.count('}')
            if brace_count == 0:
                in_function = False
        elif not in_function:
            # Collect non-function lines

            # Bootstrap/loader section
            if ('Community-Scripts bootstrap' in line or
                    'Load core' in line or
                    in_bootstrap_comment):
                bootstrap_lines.append(line)
                if '# ---' in line or '# ===' in line:
                    in_bootstrap_comment = not in_bootstrap_comment
                continue

            # Source commands
            if (stripped.startswith('source <(') or
                    stripped.startswith('if command -v curl') or
                    stripped.startswith('elif command -v wget') or
                    'load_functions' in stripped or
                    'catch_errors' in stripped):
                bootstrap_lines.append(line)
                continue

            # Traps
            if stripped.startswith('trap '):
                trap_lines.append(line)
                continue

            # VAR_WHITELIST declaration
            if 'declare -ag VAR_WHITELIST' in line or (other_lines and 'VAR_WHITELIST' in other_lines[-1]):
                other_lines.append(line)
                continue

            # Empty lines between sections - keep some
            if stripped == '' and (bootstrap_lines or trap_lines or other_lines):
                if bootstrap_lines and bootstrap_lines[-1].strip() != '':
                    bootstrap_lines.append(line)
                elif trap_lines and trap_lines[-1].strip() != '':
                    trap_lines.append(line)

    # Add bootstrap section if exists
    if bootstrap_lines:
        optimized_parts.append(create_section_header("DEPENDENCY LOADING"))
        optimized_parts.extend(bootstrap_lines)
        optimized_parts.append('')

    # Add other declarations
    if other_lines:
        optimized_parts.extend(other_lines)
        optimized_parts.append('')

    # Write output
    optimized_content = '\n'.join(optimized_parts)
    optimized_lines = len(optimized_content.split('\n'))

    print()
    print(f"💾 Writing optimized file: {output_file}")
    output_file.write_text(optimized_content, encoding='utf-8')

    print()
    print("=" * 80)
    print("✅ OPTIMIZATION COMPLETE")
    print("=" * 80)
    print(f"Original lines:     {original_lines:,}")
    print(f"Optimized lines:    {optimized_lines:,}")
    print(f"Difference:         {original_lines - optimized_lines:+,}")
    print(f"Functions:          {len(functions) - len(duplicates)}")
    print(f"Duplicates removed: {len(duplicates)}")
    print()


# ==============================================================================
# ENTRY POINT
# ==============================================================================

def main():
    """Main entry point"""

    # Set paths
    script_dir = Path(__file__).parent
    input_file = script_dir / "build.func"

    # Create backup first
    timestamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    backup_file = script_dir / f"build.func.backup-{timestamp}"

    if not input_file.exists():
        print(f"❌ Error: {input_file} not found!")
        sys.exit(1)

    print(f"📦 Creating backup: {backup_file.name}")
    backup_file.write_text(input_file.read_text(encoding='utf-8'), encoding='utf-8')
    print()

    # Optimize
    output_file = script_dir / "build.func.optimized"
    optimize_build_func(input_file, output_file)

    print("📋 Next steps:")
    print(f"   1. Review: {output_file.name}")
    print("   2. Test the optimized version")
    print("   3. If OK: mv build.func.optimized build.func")
    print(f"   4. Backup available at: {backup_file.name}")
    print()


if __name__ == "__main__":
    main()
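For reference, a typical run from the repository's misc/ directory might look like the following; the script creates its own timestamped backup before writing build.func.optimized:

```bash
cd misc
python3 optimize_build_func.py

# Inspect the reordered file, then swap it in if it looks right.
diff -u build.func build.func.optimized | less
mv build.func.optimized build.func
```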