Compare commits
38 commits: 123-api-cg ... master

SHA1:
dc59211634, 649a426c25, 69b8512d6b, 82f7a7233d, f4f56e53a8, 03f8b86877, 045e157266, 8d9f913cbc, 05b1963a34, 8fa9076f91, bd0a9865a1, 9a7d6b9084, 946e83fd15, 362e0f7a6f, e396b0c01e, a58dcdafb1, 037827669a, 31f5810bef, 8aeb802ec8, 0ecf214e26, 3878990c4e, df0fb32592, b80db52ea5, 8401cc51fa, cc538b719f, fe797ebeeb, 871d9d735e, bce317a554, 6c7e8c46d6, c27b99ff52, e02bab244d, db3e0d1273, fd59ab9e26, d2ab856d64, c64bbf4a70, 021e5862ff, e305d486f2, b77687ea14
353 changed files with 7214 additions and 78703 deletions
.gitignore (vendored): 6 changed lines
```
@@ -112,4 +112,8 @@ install_plugin_cad.sh
 *#
 .#*
 \#*\#
 out/
+
+#freecad_workbench
+freecad_workbench/freecad/update_workbench.sh
+*.FCBak
```
.gitmodules (vendored): 10 changed lines
```
@@ -1,3 +1,9 @@
-[submodule "insertion_vector_predicate/assembly"]
-	path = insertion_vector_predicate/assembly
+[submodule "rcg_pipeline"]
+	path = rcg_pipeline
+	url = https://gitlab.com/robossembler/rcg-pipeline.git
+[submodule "freecad_workbench"]
+	path = freecad_workbench
+	url = https://gitlab.com/robossembler/robossembler-freecad-workbench.git
+[submodule "simulation/insertion_vector_predicate/assembly"]
+	path = simulation/insertion_vector_predicate/assembly
+	url = https://github.com/yunshengtian/Assemble-Them-All
```
File diff suppressed because it is too large
@@ -1,105 +0,0 @@
# Launch instructions

The [BlenderProc](https://github.com/DLR-RM/BlenderProc) package must be installed.

## Creating a YoloV4-format dataset for a given object

Run command:

```
blenderproc run obj2Yolov4dataset.py [obj] [output_dir] [--imgs 1]
```
- obj: *.obj object description file
- output_dir: output directory
- --imgs 1: number of output images

## Creating a YoloV4-format dataset for a set of given objects in a given scene

Run command:
```
blenderproc run objs2Yolov4dataset.py [scene] [obj_path] [output_dir] [vhacd_path] [--imgs 1]
```
- scene: path to the scene description file (*.blend)
- obj_path: path to the directory with the *.obj files of the objects to be detected
- output_dir: output directory
- vhacd_path: directory where vhacd should be installed or is already installed (default: blenderproc_resources/vhacd)
- --imgs 1: number of rendering series (15 images each) in the output (e.g., imgs=100 yields 1500 images)

The scene description file must contain a plane named 'floor' onto which the objects to be detected are sampled.

The [darknet](https://github.com/AlexeyAB/darknet) package must be built for the target software and hardware (CPU, GPU, ...).

---

## Training the network and obtaining its weights file

Run command:
```
darknet detector train [data] [cfg] [weight]
```
- data: dataset description file (*.data)
- cfg: network description file
- weight: network weights file

For training, download the file with pre-trained weights (162 MB): [yolov4.conv.137](https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.conv.137)
Each number of detected objects in the dataset requires its own [data](https://gitlab.com/robossembler/framework/-/blob/master/ObjectDetection/yolov4_objs2.data) and [cfg](https://gitlab.com/robossembler/framework/-/blob/master/ObjectDetection/yolov4_objs2.cfg) files.

---

## Detecting objects with the trained weights

* option 1 (the file t.txt contains a list of images):
```
darknet detector test yolov4_objs2.data yolov4_test.cfg yolov4_objs2_final.weights -dont_show -ext_output < t.txt > res.txt
```

* option 2 (the file 000015.jpg is a test image):
```
darknet detector test yolov4_objs2.data yolov4_test.cfg yolov4_objs2_final.weights -dont_show -ext_output 000015.jpg > res.txt
```
* option 3 (the file t.txt contains a list of images; results go to JSON):
```
darknet detector test yolov4_objs2.data yolov4_test.cfg yolov4_objs2_final.weights -dont_show -ext_output -out res.json < t.txt
```

The res.txt file after running option 2:

> net.optimized_memory = 0
> mini_batch = 1, batch = 1, time_steps = 1, train = 0
> Create CUDA-stream - 0
> Create cudnn-handle 0
> nms_kind: greedynms (1), beta = 0.600000
> nms_kind: greedynms (1), beta = 0.600000
> nms_kind: greedynms (1), beta = 0.600000
>
> seen 64, trained: 768 K-images (12 Kilo-batches_64)
> Detection layer: 139 - type = 28
> Detection layer: 150 - type = 28
> Detection layer: 161 - type = 28
> 000015.jpg: Predicted in 620.357000 milli-seconds.
> fork.001: 94% (left_x: 145 top_y: -0 width: 38 height: 18)
> asm_element_edge.001: 28% (left_x: 195 top_y: 320 width: 40 height: 61)
> start_link.001: 87% (left_x: 197 top_y: 313 width: 39 height: 68)
> doking_link.001: 99% (left_x: 290 top_y: 220 width: 32 height: 21)
> start_link.001: 90% (left_x: 342 top_y: 198 width: 33 height: 34)
> doking_link.001: 80% (left_x: 342 top_y: 198 width: 32 height: 34)
> assemb_link.001: 100% (left_x: 426 top_y: 410 width: 45 height: 61)

The res.json file after running option 3:

```json
[
  {
    "frame_id": 1,
    "filename": "img_test/000001.jpg",
    "objects": [
      {"class_id":5, "name":"asm_element_edge.001", "relative_coordinates":{"center_x":0.498933, "center_y":0.502946, "width":0.083075, "height":0.073736}, "confidence":0.999638},
      {"class_id":4, "name":"grip-tool.001", "relative_coordinates":{"center_x":0.858856, "center_y":0.031339, "width":0.043919, "height":0.064563}, "confidence":0.996551}
    ]
  },
  {
    "frame_id": 2,
    "filename": "img_test/000002.jpg",
    "objects": [
      {"class_id":1, "name":"start_link.001", "relative_coordinates":{"center_x":0.926026, "center_y":0.728457, "width":0.104029, "height":0.132757}, "confidence":0.995811},
      {"class_id":0, "name":"assemb_link.001", "relative_coordinates":{"center_x":0.280403, "center_y":0.129059, "width":0.029980, "height":0.025067}, "confidence":0.916782}
    ]
  }
]
```
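The detections in res.json use the same relative-coordinate convention as the training labels. Below is a minimal Python sketch for consuming this output; it assumes only the res.json layout shown above and the 640x480 render resolution used by the dataset scripts (file names are hypothetical):

```python
import json

def to_pixel_bbox(rel, img_w=640, img_h=480):
    """Convert darknet relative coordinates to a pixel bbox (left_x, top_y, width, height)."""
    w = rel["width"] * img_w
    h = rel["height"] * img_h
    left = rel["center_x"] * img_w - w / 2
    top = rel["center_y"] * img_h - h / 2
    return left, top, w, h

with open("res.json") as fh:
    frames = json.load(fh)

for frame in frames:
    for obj in frame["objects"]:
        if obj["confidence"] > 0.5:  # keep confident detections only
            print(frame["filename"], obj["name"], to_pixel_bbox(obj["relative_coordinates"]))
```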
@@ -1,144 +0,0 @@
```python
import blenderproc as bproc
"""
obj2Yolov4dataset
Overall task: object detection
Implemented function: creating a YoloV4-format dataset for a given object (*.obj)
Uses the blenderproc module

24.01.2023 @shalenikol release 0.1
22.02.2023 @shalenikol release 0.2 fixed the x,y calculation in convert2relative
"""
import numpy as np
import argparse
import random
import os
import shutil
import json

def convert2relative(height, width, bbox):
    """
    YOLO format uses relative coordinates for annotation
    """
    x, y, w, h = bbox
    x += w/2
    y += h/2
    return x/width, y/height, w/width, h/height

parser = argparse.ArgumentParser()
parser.add_argument('scene', nargs='?', default="resources/robossembler-asset.obj", help="Path to the object file.")
parser.add_argument('output_dir', nargs='?', default="output", help="Path to where the final files will be saved")
parser.add_argument('--imgs', default=1, type=int, help="The number of times the objects should be rendered.")
args = parser.parse_args()

if not os.path.isdir(args.output_dir):
    os.mkdir(args.output_dir)

bproc.init()

# load the objects into the scene
obj = bproc.loader.load_obj(args.scene)[0]
obj.set_cp("category_id", 1)

# Randomly perturb the material of the object
mat = obj.get_materials()[0]
mat.set_principled_shader_value("Specular", random.uniform(0, 1))
mat.set_principled_shader_value("Roughness", random.uniform(0, 1))
mat.set_principled_shader_value("Base Color", np.random.uniform([0, 0, 0, 1], [1, 1, 1, 1]))
mat.set_principled_shader_value("Metallic", random.uniform(0, 1))

# Create a new light
light = bproc.types.Light()
light.set_type("POINT")
# Sample its location around the object
light.set_location(bproc.sampler.shell(
    center=obj.get_location(),
    radius_min=1,
    radius_max=5,
    elevation_min=1,
    elevation_max=89
))
# Randomly set the color and energy
light.set_color(np.random.uniform([0.5, 0.5, 0.5], [1, 1, 1]))
light.set_energy(random.uniform(100, 1000))

bproc.camera.set_resolution(640, 480)

# Sample camera poses
poses = 0
tries = 0
while tries < 10000 and poses < args.imgs:
    # Sample random camera location around the object
    location = bproc.sampler.shell(
        center=obj.get_location(),
        radius_min=1,
        radius_max=4,
        elevation_min=1,
        elevation_max=89
    )
    # Compute rotation based on a lookat point which is placed randomly around the object
    lookat_point = obj.get_location() + np.random.uniform([-0.5, -0.5, -0.5], [0.5, 0.5, 0.5])
    rotation_matrix = bproc.camera.rotation_from_forward_vec(lookat_point - location, inplane_rot=np.random.uniform(-0.7854, 0.7854))
    # Add homogeneous camera pose based on location and rotation
    cam2world_matrix = bproc.math.build_transformation_mat(location, rotation_matrix)

    # Only add the camera pose if the object is still visible
    if obj in bproc.camera.visible_objects(cam2world_matrix):
        bproc.camera.add_camera_pose(cam2world_matrix)
        poses += 1
    tries += 1

# Enable transparency so the background becomes transparent
bproc.renderer.set_output_format(enable_transparency=True)
# add segmentation masks (per class and per instance)
bproc.renderer.enable_segmentation_output(map_by=["category_id", "instance", "name"])

# Render RGB images
data = bproc.renderer.render()

# Write data to COCO file
res_dir = os.path.join(args.output_dir, 'coco_data')
bproc.writer.write_coco_annotations(res_dir,
                                    instance_segmaps=data["instance_segmaps"],
                                    instance_attribute_maps=data["instance_attribute_maps"],
                                    color_file_format='JPEG',
                                    colors=data["colors"],
                                    append_to_existing_output=True)

# load the annotation
with open(os.path.join(res_dir, "coco_annotations.json"), "r") as fh:
    y = json.load(fh)

# list of object names
with open(os.path.join(res_dir, "obj.names"), "w") as fh:
    for cat in y["categories"]:
        fh.write(cat["name"] + "\n")

# create or clear the data folder for the dataset
res_data = os.path.join(res_dir, 'data')
if os.path.isdir(res_data):
    for f in os.listdir(res_data):
        os.remove(os.path.join(res_data, f))
else:
    os.mkdir(res_data)

# list of image file names
s = []
with open(os.path.join(res_dir, "images.txt"), "w") as fh:
    for i in y["images"]:
        filename = i["file_name"]
        shutil.copy(os.path.join(res_dir, filename), res_data)
        fh.write(filename.replace('images', 'data') + "\n")
        s.append((os.path.split(filename))[1])

# "images" and "annotations" are assumed to appear in the same order
c = 0
for i in y["annotations"]:
    bbox = i["bbox"]
    im_h = i["height"]
    im_w = i["width"]
    rel = convert2relative(im_h, im_w, bbox)
    fn = (os.path.splitext(s[c]))[0]  # file name only
    with open(os.path.join(res_data, fn + ".txt"), "w") as fh:
        # format: <target> <x-center> <y-center> <width> <height>
        fh.write("0 " + '{:-f} {:-f} {:-f} {:-f}'.format(rel[0], rel[1], rel[2], rel[3]) + "\n")
    c += 1
```
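As a quick worked example of convert2relative, take the fork.001 detection from res.txt above as a COCO-style bbox [left_x, top_y, width, height] at the script's 640x480 resolution:

```python
# COCO bbox of the fork.001 detection: left_x=145, top_y=0, width=38, height=18
x, y, w, h = 145, 0, 38, 18
x += w / 2  # center_x in pixels: 164.0
y += h / 2  # center_y in pixels: 9.0
print(x / 640, y / 480, w / 640, h / 480)
# -> 0.25625 0.01875 0.059375 0.0375  (YOLO-relative center and size)
```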
@@ -1,296 +0,0 @@
```python
import blenderproc as bproc
"""
objs2Yolov4dataset
Overall task: object detection
Implemented function: creating a YoloV4-format dataset for a set of given objects (*.obj) in a given scene (*.blend)
Uses the blenderproc module

17.02.2023 @shalenikol release 0.1
22.02.2023 @shalenikol release 0.2 fixed the x,y calculation in convert2relative
"""
import sys
import numpy as np
import argparse
import random
import os
import shutil
import json

def convert2relative(height, width, bbox):
    """
    YOLO format uses relative coordinates for annotation
    """
    x, y, w, h = bbox
    x += w/2
    y += h/2
    return x/width, y/height, w/width, h/height

parser = argparse.ArgumentParser()
parser.add_argument('scene', nargs='?', default="resources/sklad.blend", help="Path to the scene object.")
parser.add_argument('obj_path', nargs='?', default="resources/in_obj", help="Path to the object files.")
parser.add_argument('output_dir', nargs='?', default="output", help="Path to where the final files will be saved")
parser.add_argument('vhacd_path', nargs='?', default="blenderproc_resources/vhacd", help="The directory in which vhacd should be installed or is already installed.")
parser.add_argument('--imgs', default=2, type=int, help="The number of times the objects should be rendered.")
args = parser.parse_args()

if not os.path.isdir(args.obj_path):
    print(f"{args.obj_path} : no object directory")
    sys.exit()

if not os.path.isdir(args.output_dir):
    os.mkdir(args.output_dir)

bproc.init()

# ? load the light from the scene
#cam = bproc.loader.load_blend(args.scene, data_blocks=["cameras"])
#lights = bproc.loader.load_blend(args.scene, data_blocks=["lights"])

# load the objects
list_files = os.listdir(args.obj_path)
meshs = []
i = 0
for f in list_files:
    if (os.path.splitext(f))[1] == ".obj":
        f = os.path.join(args.obj_path, f)  # path to the object file
        if os.path.isfile(f):
            meshs += bproc.loader.load_obj(f)
            i += 1

if i == 0:
    print("Objects not found")
    sys.exit()

for i, o in enumerate(meshs):
    o.set_cp("category_id", i + 1)

# load the scene
scene = bproc.loader.load_blend(args.scene, data_blocks=["objects"])
#scene = bproc.loader.load_obj(args.scene)

# find the floor
floor = None
for o in scene:
    o.set_cp("category_id", 999)
    s = o.get_name()
    if s.find("floor") >= 0:
        floor = o
if floor == None:
    print("Floor not found in the scene")
    sys.exit()

floor.enable_rigidbody(False, collision_shape='BOX')

objs = meshs + scene

for obj in meshs:
    # Make the object actively participate in the physics simulation
    obj.enable_rigidbody(active=True, collision_shape="COMPOUND")
    # Also use convex decomposition as collision shapes
    obj.build_convex_decomposition_collision_shape(args.vhacd_path)

with open(os.path.join(args.output_dir, "res.txt"), "w") as fh:
    # fh.write(str(type(scene[0]))+"\n")
    i = 0
    for o in objs:
        i += 1
        loc = o.get_location()
        euler = o.get_rotation_euler()
        fh.write(f"{i} : {o.get_name()} {loc} {euler}\n")

# define a light and set its location and energy level
light = bproc.types.Light()
light.set_type("POINT")
light.set_location([5, -5, 5])
#light.set_energy(900)
#light.set_color([0.7, 0.7, 0.7])

light1 = bproc.types.Light(name="light1")
light1.set_type("SUN")
light1.set_location([0, 0, 0])
light1.set_rotation_euler([-0.063, 0.6177, -0.1985])
#light1.set_energy(7)
light1.set_color([1, 1, 1])
"""
# Sample its location around the object
light.set_location(bproc.sampler.shell(
    center=obj.get_location(),
    radius_min=2.5,
    radius_max=5,
    elevation_min=1,
    elevation_max=89
))
"""

# define the camera intrinsics
bproc.camera.set_intrinsics_from_blender_params(1, 640, 480, lens_unit="FOV")
bproc.renderer.enable_segmentation_output(map_by=["category_id", "instance", "name"])

res_dir = os.path.join(args.output_dir, 'coco_data')
# Rendering loop
n_cam_location = 5  # number of random camera locations
n_cam_poses = 3     # number of samples for each camera location
# Do multiple times: position the objects using the physics simulator and render X images with random camera poses
for r in range(args.imgs):
    # Randomly set the color and energy
    light.set_color(np.random.uniform([0.5, 0.5, 0.5], [1, 1, 1]))
    light.set_energy(random.uniform(500, 1000))
    light1.set_energy(random.uniform(3, 11))

    for i, o in enumerate(objs):
        mat = o.get_materials()[0]
        mat.set_principled_shader_value("Specular", random.uniform(0, 1))
        mat.set_principled_shader_value("Roughness", random.uniform(0, 1))
        mat.set_principled_shader_value("Base Color", np.random.uniform([0, 0, 0, 1], [1, 1, 1, 1]))
        mat.set_principled_shader_value("Metallic", random.uniform(0, 1))

    # Clear all key frames from the previous run
    bproc.utility.reset_keyframes()

    # Define a function that samples 6-DoF poses
    def sample_pose(obj: bproc.types.MeshObject):
        obj.set_location(np.random.uniform([-1, -1.5, 0.2], [1, 2, 1.2]))  #[-1, -1, 0], [1, 1, 2]))
        obj.set_rotation_euler(bproc.sampler.uniformSO3())

    # Sample the poses of all objects above the ground without any collisions in-between
    bproc.object.sample_poses(meshs, objects_to_check_collisions=meshs + [floor], sample_pose_func=sample_pose)

    # Run the simulation and fix the poses of the objects at the end
    bproc.object.simulate_physics_and_fix_final_poses(min_simulation_time=4, max_simulation_time=20, check_object_interval=1)

    # Find the point of interest, all camera poses should look towards it
    poi = bproc.object.compute_poi(meshs)

    coord_max = [0.1, 0.1, 0.1]
    coord_min = [0., 0., 0.]

    with open(os.path.join(args.output_dir, "res.txt"), "a") as fh:
        fh.write("*****************\n")
        fh.write(f"{r}) poi = {poi}\n")
        i = 0
        for o in meshs:
            i += 1
            loc = o.get_location()
            euler = o.get_rotation_euler()
            fh.write(f"  {i} : {o.get_name()} {loc} {euler}\n")
            for j in range(3):
                if loc[j] < coord_min[j]:
                    coord_min[j] = loc[j]
                if loc[j] > coord_max[j]:
                    coord_max[j] = loc[j]

    # Sample up to X camera poses
    #an = np.random.uniform(0.78, 1.2) #1. #0.35
    for i in range(n_cam_location):
        # Sample location
        location = bproc.sampler.shell(center=[0, 0, 0],
                                       radius_min=1.1,
                                       radius_max=3.3,
                                       elevation_min=5,
                                       elevation_max=89)
        # the coordinate along which the camera position is sampled
        j = random.randint(0, 2)
        # one-time shift along the random coordinate
        d = (coord_max[j] - coord_min[j]) / n_cam_poses
        if location[j] < 0:
            d = -d
        for k in range(n_cam_poses):
            # Compute rotation based on the vector going from the location towards the poi
            rotation_matrix = bproc.camera.rotation_from_forward_vec(poi - location, inplane_rot=np.random.uniform(-0.7854, 0.7854))
            # Add homogeneous camera pose based on location and rotation
            cam2world_matrix = bproc.math.build_transformation_mat(location, rotation_matrix)
            bproc.camera.add_camera_pose(cam2world_matrix)
            location[j] -= d
        #world_matrix = bproc.math.build_transformation_mat([2.3, -0.4, 0.66], [1.396, 0., an])
        #bproc.camera.add_camera_pose(world_matrix)
        #an += 0.2

    # render the whole pipeline
    data = bproc.renderer.render()

    # Write data to COCO file
    bproc.writer.write_coco_annotations(res_dir,
                                        instance_segmaps=data["instance_segmaps"],
                                        instance_attribute_maps=data["instance_attribute_maps"],
                                        color_file_format='JPEG',
                                        colors=data["colors"],
                                        append_to_existing_output=True)

# load the annotation
with open(os.path.join(res_dir, "coco_annotations.json"), "r") as fh:
    y = json.load(fh)

# list of object names
n_obj = 0
obj_list = []
with open(os.path.join(res_dir, "obj.names"), "w") as fh:
    for cat in y["categories"]:
        if cat["id"] < 999:
            n = cat["name"]
            i = cat["id"]
            obj_list.append([n, i, n_obj])
            fh.write(n + "\n")
            n_obj += 1

# create or clear the data folder for the dataset
res_data = os.path.join(res_dir, 'data')
if os.path.isdir(res_data):
    for f in os.listdir(res_data):
        os.remove(os.path.join(res_data, f))
else:
    os.mkdir(res_data)

# list of image file names
fn_image = os.path.join(res_dir, "images.txt")
img_list = []
with open(fn_image, "w") as fh:
    for i in y["images"]:
        filename = i["file_name"]
        shutil.copy(os.path.join(res_dir, filename), res_data)
        fh.write(filename.replace('images', 'data') + "\n")
        img_list.append([i["id"], (os.path.split(filename))[1]])

# create 2 lists of file names, for train and valid
n_image_in_series = n_cam_location * n_cam_poses  # number of images in a series
i = 0
fh = open(fn_image, "r")
f1 = open(os.path.join(res_dir, "i_train.txt"), "w")
f2 = open(os.path.join(res_dir, "i_val.txt"), "w")
for line in fh:
    i += 1
    if i % n_image_in_series == 0:
        f2.write(line)
    else:
        f1.write(line)
fh.close()
f1.close()
f2.close()

# fill the files with bbox labels
for i in y["annotations"]:
    cat_id = i["category_id"]
    if cat_id < 999:
        im_id = i["image_id"]
        bbox = i["bbox"]
        im_h = i["height"]
        im_w = i["width"]
        rel = convert2relative(im_h, im_w, bbox)

        # find the list index with the needed image
        j = next(k for k, (x, _) in enumerate(img_list) if x == im_id)
        filename = img_list[j][1]
        fn = (os.path.splitext(filename))[0]  # file name only
        with open(os.path.join(res_data, fn + ".txt"), "a") as fh:
            # find the list index with the needed object
            j = next(k for k, (_, x, _) in enumerate(obj_list) if x == cat_id)
            # format: <target> <x-center> <y-center> <width> <height>
            fh.write(f"{obj_list[j][2]} {rel[0]} {rel[1]} {rel[2]} {rel[3]}\n")

# create the dataset description file for darknet
with open(os.path.join(res_dir, "yolov4_objs2.data"), "w") as fh:
    fh.write(f"classes = {n_obj}\n")
    fh.write("train = i_train.txt\n")
    fh.write("valid = i_val.txt\n")
    fh.write("names = obj.names\n")
    fh.write("backup = backup\n")
    fh.write("eval = coco\n")
```
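Each rendering series above produces n_cam_location * n_cam_poses = 15 images, and the split loop routes every 15th image to the validation list. A small sketch of the resulting proportions (the constants follow the script; imgs=100 matches the example in the instructions):

```python
n_cam_location, n_cam_poses = 5, 3
n_image_in_series = n_cam_location * n_cam_poses  # 15 images per series
imgs = 100                                        # number of series
total = imgs * n_image_in_series                  # 1500 images in total
n_val = total // n_image_in_series                # every 15th image: 100 validation images
n_train = total - n_val                           # 1400 training images
print(total, n_train, n_val)                      # 1500 1400 100
```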
File diff suppressed because it is too large
@@ -1,7 +0,0 @@
```
classes = 1
train = i_train.txt
valid = i_val.txt
names = obj.names
backup = backup
eval = coco
```
File diff suppressed because it is too large
@@ -1,7 +0,0 @@
```
classes = 6
train = i_train.txt
valid = i_val.txt
names = obj.names
backup = backup
eval = coco
```
File diff suppressed because it is too large
@@ -1,44 +0,0 @@
---
id: BOP_dataset
title: script to create a BOP dataset
---

## Input data structure:
```
<example_dir>/
    input_obj/asm_element_edge.mtl  # material file
    input_obj/asm_element_edge.obj  # mesh object
    input_obj/fork.mtl
    input_obj/fork.obj
    input_obj/...
    resources/sklad.blend           # scene file
    objs2BOPdataset.py              # this script
```

## Example of the script run command:
```
cd <example_dir>/
blenderproc run objs2BOPdataset.py resources/sklad.blend input_obj output --imgs 333
```
- resources/sklad.blend : scene file
- input_obj : directory with the mesh files
- output : output directory
- imgs : number of batches, 9 frames each (333 * 9 = 2997 in this example)

## Structure of the output BOP dataset:
```
output/
    bop_data/
        train_pbr/
            000000/
                depth/...       # depth files
                mask/...        # mask files
                mask_visib/...  # visibility mask files
                rgb/...         # RGB image files
                scene_camera.json
                scene_gt.json
                scene_gt_coco.json
                scene_gt_info.json
        camera.json             # camera intrinsics (for the whole dataset)
    res.txt                     # log of dataset batch creation
```
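A minimal sketch for inspecting the generated annotations. It assumes only the layout above and a COCO-style scene_gt_coco.json with "categories", "images", and "annotations" keys, which is the structure the script itself patches at the end:

```python
import json
import os

res_dir = "output/bop_data"
coco_file = os.path.join(res_dir, "train_pbr/000000/scene_gt_coco.json")

with open(coco_file) as fh:
    coco = json.load(fh)

print("categories:", [(c["id"], c["name"]) for c in coco["categories"]])
print("images:", len(coco["images"]))
print("annotations:", len(coco["annotations"]))
```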
@@ -1,261 +0,0 @@
```python
import blenderproc as bproc
"""
objs2BOPdataset
Overall task: 6D pose estimation of an object
Implemented function: creating a BOP-format dataset for a set of given objects (*.obj) in a given scene (*.blend)
Uses the blenderproc module

29.08.2023 @shalenikol release 0.1
12.10.2023 @shalenikol release 0.2
"""
import sys
import numpy as np
import argparse
import random
import os
import shutil
import json

Not_Categories_Name = True  # the category name is missing from the annotation

def convert2relative(height, width, bbox):
    """
    YOLO format uses relative coordinates for annotation
    """
    x, y, w, h = bbox
    x += w/2
    y += h/2
    return x/width, y/height, w/width, h/height

parser = argparse.ArgumentParser()
parser.add_argument('scene', nargs='?', default="resources/sklad.blend", help="Path to the scene object.")
parser.add_argument('obj_path', nargs='?', default="resources/in_obj", help="Path to the object files.")
parser.add_argument('output_dir', nargs='?', default="output", help="Path to where the final files will be saved")
parser.add_argument('vhacd_path', nargs='?', default="blenderproc_resources/vhacd", help="The directory in which vhacd should be installed or is already installed.")
parser.add_argument('-single_object', nargs='?', type=bool, default=True, help="One object per frame.")
parser.add_argument('--imgs', default=2, type=int, help="The number of times the objects should be rendered.")
args = parser.parse_args()

if not os.path.isdir(args.obj_path):
    print(f"{args.obj_path} : no object directory")
    sys.exit()

if not os.path.isdir(args.output_dir):
    os.mkdir(args.output_dir)

single_object = args.single_object

bproc.init()

# ? load the light from the scene
#cam = bproc.loader.load_blend(args.scene, data_blocks=["cameras"])
#lights = bproc.loader.load_blend(args.scene, data_blocks=["lights"])

# load the objects
list_files = os.listdir(args.obj_path)
obj_names = []
obj_filenames = []
all_meshs = []
nObj = 0
for f in list_files:
    if (os.path.splitext(f))[1] == ".obj":
        f = os.path.join(args.obj_path, f)  # path to the object file
        if os.path.isfile(f):
            obj = bproc.loader.load_obj(f)
            all_meshs += obj
            obj_names += [obj[0].get_name()]
            obj_filenames += [f]
            nObj += 1

if nObj == 0:
    print("Objects not found")
    sys.exit()

for i, obj in enumerate(all_meshs):
    #print(f"{i} *** {obj}")
    obj.set_cp("category_id", i + 1)

# load the scene
scene = bproc.loader.load_blend(args.scene, data_blocks=["objects"])

# find the collision objects (the floor etc.)
obj_type = ["floor", "obj"]
collision_objects = []
#floor = None
for o in scene:
    o.set_cp("category_id", 999)
    s = o.get_name()
    for type in obj_type:
        if s.find(type) >= 0:
            collision_objects += [o]
            o.enable_rigidbody(False, collision_shape='BOX')
if not collision_objects:
    print("Collision objects not found in the scene")
    sys.exit()

#floor.enable_rigidbody(False, collision_shape='BOX')

for obj in all_meshs:
    # Make the object actively participate in the physics simulation
    obj.enable_rigidbody(active=True, collision_shape="COMPOUND")
    # Also use convex decomposition as collision shapes
    obj.build_convex_decomposition_collision_shape(args.vhacd_path)

objs = all_meshs + scene

with open(os.path.join(args.output_dir, "res.txt"), "w") as fh:
    # fh.write(str(type(scene[0]))+"\n")
    i = 0
    for o in objs:
        i += 1
        loc = o.get_location()
        euler = o.get_rotation_euler()
        fh.write(f"{i} : {o.get_name()} {loc} {euler} category_id = {o.get_cp('category_id')}\n")

# define a light and set its location and energy level
light = bproc.types.Light()
light.set_type("POINT")
light.set_location([5, -5, 5])
#light.set_energy(900)
#light.set_color([0.7, 0.7, 0.7])

light1 = bproc.types.Light(name="light1")
light1.set_type("SUN")
light1.set_location([0, 0, 0])
light1.set_rotation_euler([-0.063, 0.6177, -0.1985])
#light1.set_energy(7)
light1.set_color([1, 1, 1])

# define the camera intrinsics
bproc.camera.set_intrinsics_from_blender_params(1, 640, 480, lens_unit="FOV")

# add segmentation masks (per class and per instance)
bproc.renderer.enable_segmentation_output(map_by=["category_id", "instance", "name"])
#bproc.renderer.enable_segmentation_output(map_by=["category_id", "instance", "name", "bop_dataset_name"],
#                                          default_values={"category_id": 0, "bop_dataset_name": None})

# activate depth rendering
bproc.renderer.enable_depth_output(activate_antialiasing=False)

res_dir = os.path.join(args.output_dir, "bop_data")
if os.path.isdir(res_dir):
    shutil.rmtree(res_dir)
# Rendering loop
n_cam_location = 3 #5 # number of random camera locations
n_cam_poses = 3 #3 # number of samples for each camera location
# Do multiple times: position the objects using the physics simulator and render X images with random camera poses
for r in range(args.imgs):
    # one random object per frame / all given objects
    meshs = [random.choice(all_meshs)] if single_object else all_meshs[:]

    # Randomly set the color and energy
    light.set_color(np.random.uniform([0.5, 0.5, 0.5], [1, 1, 1]))
    light.set_energy(random.uniform(500, 1000))
    light1.set_energy(random.uniform(3, 11))

    for i, o in enumerate(meshs):  #objs
        mat = o.get_materials()[0]
        mat.set_principled_shader_value("Specular", random.uniform(0, 1))
        mat.set_principled_shader_value("Roughness", random.uniform(0, 1))
        mat.set_principled_shader_value("Base Color", np.random.uniform([0, 0, 0, 1], [1, 1, 1, 1]))
        mat.set_principled_shader_value("Metallic", random.uniform(0, 1))

    # Clear all key frames from the previous run
    bproc.utility.reset_keyframes()

    # Define a function that samples 6-DoF poses
    def sample_pose(obj: bproc.types.MeshObject):
        obj.set_location(np.random.uniform([-1, -1.5, 0.2], [1, 2, 1.2]))  #[-1, -1, 0], [1, 1, 2]))
        obj.set_rotation_euler(bproc.sampler.uniformSO3())

    # Sample the poses of all objects above the ground without any collisions in-between
    #bproc.object.sample_poses(meshs, objects_to_check_collisions = meshs + [floor], sample_pose_func = sample_pose)
    bproc.object.sample_poses(meshs, objects_to_check_collisions=meshs + collision_objects, sample_pose_func=sample_pose)

    # Run the simulation and fix the poses of the objects at the end
    bproc.object.simulate_physics_and_fix_final_poses(min_simulation_time=4, max_simulation_time=20, check_object_interval=1)

    # Find the point of interest, all camera poses should look towards it
    poi = bproc.object.compute_poi(meshs)

    coord_max = [0.1, 0.1, 0.1]
    coord_min = [0., 0., 0.]

    with open(os.path.join(args.output_dir, "res.txt"), "a") as fh:
        fh.write("*****************\n")
        fh.write(f"{r}) poi = {poi}\n")
        i = 0
        for o in meshs:
            i += 1
            loc = o.get_location()
            euler = o.get_rotation_euler()
            fh.write(f"  {i} : {o.get_name()} {loc} {euler}\n")
            for j in range(3):
                if loc[j] < coord_min[j]:
                    coord_min[j] = loc[j]
                if loc[j] > coord_max[j]:
                    coord_max[j] = loc[j]

    # Sample up to X camera poses
    #an = np.random.uniform(0.78, 1.2) #1. #0.35
    for i in range(n_cam_location):
        # Sample location
        location = bproc.sampler.shell(center=[0, 0, 0],
                                       radius_min=1.1,
                                       radius_max=2.2,
                                       elevation_min=5,
                                       elevation_max=89)
        # the coordinate along which the camera position is sampled
        j = random.randint(0, 2)
        # one-time shift along the random coordinate
        d = (coord_max[j] - coord_min[j]) / n_cam_poses
        if location[j] < 0:
            d = -d
        for k in range(n_cam_poses):
            # Compute rotation based on the vector going from the location towards the poi
            rotation_matrix = bproc.camera.rotation_from_forward_vec(poi - location, inplane_rot=np.random.uniform(-0.7854, 0.7854))
            # Add homogeneous camera pose based on location and rotation
            cam2world_matrix = bproc.math.build_transformation_mat(location, rotation_matrix)
            bproc.camera.add_camera_pose(cam2world_matrix)
            location[j] -= d
        #world_matrix = bproc.math.build_transformation_mat([2.3, -0.4, 0.66], [1.396, 0., an])
        #bproc.camera.add_camera_pose(world_matrix)
        #an += 0.2

    # render the whole pipeline
    data = bproc.renderer.render()
    # Write data in BOP format
    bproc.writer.write_bop(res_dir,
                           target_objects = all_meshs,  # Optional[List[MeshObject]] = None
                           depths = data["depth"],
                           depth_scale = 1.0,
                           colors = data["colors"],
                           color_file_format = 'JPEG',
                           append_to_existing_output = (r > 0),
                           save_world2cam = False)  # world coords are arbitrary in most real BOP datasets
                           # dataset="robo_ds",
"""
!!! categories -> name is taken from category_id !!!
see below
blenderproc.python.writer : BopWriterUtility.py
class _BopWriterUtility
def calc_gt_coco
    ...
    CATEGORIES = [{'id': obj.get_cp('category_id'), 'name': str(obj.get_cp('category_id')), 'supercategory':
        dataset_name} for obj in dataset_objects]

therefore we replace the category name in the annotation
"""
if Not_Categories_Name:
    coco_file = os.path.join(res_dir, "train_pbr/000000/scene_gt_coco.json")
    with open(coco_file, "r") as fh:
        data = json.load(fh)
    cats = data["categories"]
    #print(f"type(cat) = {type(cat)} cat : {cat}")
    i = 0
    for cat in cats:
        cat["name"] = obj_names[i]
        i += 1
        #print(cat)
    with open(coco_file, "w") as fh:
        json.dump(data, fh, indent=0)
```
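The final block exists because BlenderProc's BOP writer (see the quoted BopWriterUtility fragment) fills each COCO category name with the stringified category_id, so the script rewrites the names using the meshes loaded from the .obj files. A before/after sketch with illustrative object names:

```python
# Before patching: the writer stringifies category_id into the name field
cats = [{"id": 1, "name": "1"}, {"id": 2, "name": "2"}]
# After patching: names come from the loaded mesh objects (illustrative names)
obj_names = ["fork", "asm_element_edge"]
for i, cat in enumerate(cats):
    cat["name"] = obj_names[i]
print(cats)  # [{'id': 1, 'name': 'fork'}, {'id': 2, 'name': 'asm_element_edge'}]
```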
@@ -1,5 +1,7 @@
# Robossembler Framework

The Robossembler Framework is designed to automate the development of control programs for robotic manipulators, to debug them in virtual environments, and to evaluate their performance.

The framework consists of the following functional modules

@@ -21,3 +23,9 @@
- __Stable-feasibility predicate__. Holds for an assembly sequence when the assembly reaches a stable state at each stage.
- __Degrees-of-freedom predicate__. Built from the already generated assembly graph(s); describes in which degrees of freedom a part can be moved.

# Scene generation

TODO: write a description

[example of a scene description file](docs/scene_generator)
asp-review-app/.gitignore (vendored): 26 changed lines
@@ -1,26 +0,0 @@
```
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

# dependencies
node_modules
/.pnp
.pnp.js

# testing
/coverage

# production
/build

# misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
**/node_modules
server/public/
**/computed/
```
asp-review-app/server/package-lock.json (generated): 2719 changed lines. File diff suppressed because it is too large.
@@ -1,45 +0,0 @@
```json
{
  "name": "express-typescript",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "build": "npx tsc",
    "start": "npx tsc && node --experimental-specifier-resolution=node dist/server.js",
    "dev": "nodemon --exec ts-node --esm --transpileOnly ./src/server.ts"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "@types/compression": "^1.7.2",
    "@types/cors": "^2.8.13",
    "@types/express": "^4.17.17",
    "@types/express-fileupload": "^1.4.1",
    "@types/mongoose": "^5.11.97",
    "@types/node": "^17.0.45",
    "typescript": "^4.9.5"
  },
  "dependencies": {
    "body-parser": "^1.20.2",
    "class-transformer": "^0.5.1",
    "class-validator": "^0.14.0",
    "compression": "^1.7.4",
    "concurrently": "^8.0.1",
    "cors": "^2.8.5",
    "decompress": "^4.2.1",
    "express": "^4.18.2",
    "express-cross": "^1.0.0",
    "express-fileupload": "^1.4.0",
    "first-di": "^1.0.11",
    "form-data": "^4.0.0",
    "lodash": "^4.17.21",
    "morgan": "^1.10.0",
    "multer": "^1.4.5-lts.1",
    "node-stream-zip": "^1.15.0",
    "nodemon": "^2.0.22",
    "shelljs": "^0.8.5",
    "ts-node": "^10.9.1"
  }
}
```
@@ -1,71 +0,0 @@
```typescript
import express from "express";
import compression from "compression";
import cors from "cors";
import { Routes } from "./core/interfaces/router";

import bodyParser from "body-parser";
import fileUpload from "express-fileupload";
import { DevEnv } from "./core/env/env";
import path from 'path';
import { locator } from "./core/di/register_di";
export const dirname = path.resolve();

const corsOptions = {
  origin: process.env.CORS_ALLOW_ORIGIN || '*',
  methods: ['GET', 'PUT', 'POST', 'DELETE', 'OPTIONS'],
  allowedHeaders: ['Content-Type', 'Authorization']
};
export class App {
  public app: express.Application;

  public port: string | number;

  public env: string;

  constructor(routes: Routes[], port) {
    this.app = express();
    this.port = port;
    this.env = process.env.NODE_ENV || "development";
    this.initializeMiddleware();
    this.initializeRoutes(routes);
    this.loadAppDependencies();
  }

  public listen() {
    this.app.listen(this.port, () => {
      console.info(`=================================`);
      console.info(`======= ENV: ${this.env} =======`);
      console.info(`🚀 App listening on the port ${this.port}`);
      console.info(`=================================`);
    });
  }

  public getServer() {
    return this.app;
  }

  private initializeMiddleware() {
    this.app.use(
      cors(corsOptions)
    );
    this.app.use(compression());
    this.app.use(express.json());
    this.app.use(express.urlencoded({ extended: true }));
    this.app.use(bodyParser.json());
    this.app.use(bodyParser.urlencoded({ extended: true }));
    this.app.use(express.static(dirname + '/public/'));
    this.app.use(fileUpload({
      createParentPath: true
    }));
  }

  private initializeRoutes(routes: Routes[]) {
    routes.forEach((route) => {
      this.app.use("/", route.router);
    });
  }

  loadAppDependencies() {
    locator(new DevEnv());
  }
}
```
@@ -1,28 +0,0 @@
```typescript
import { override } from "first-di";
import { Env } from "../env/env";
import { AssemblyController } from "../../features/assembly_create/assembly_create_controller";
import { AssemblyPreviewsController } from "../../features/assembly_previews/assembly_previews_controller";
import { EntityRepository } from "../repository/entity_repository";
import { ZipRepository } from "../repository/zip_repository";
import { ComputeRepository } from "../repository/compute_repository";

export const locator = (env: Env) => {
  // override(Env, env)
  registerController(env)
  registerRepository(env)
};
const registerRepository = (env: Env) => {
  override(ZipRepository, ZipRepository);
  override(EntityRepository, EntityRepository);
  override(ComputeRepository, ComputeRepository);
}
const registerController = (env: Env) => {
  override(AssemblyController, AssemblyController)
  override(AssemblyPreviewsController, AssemblyPreviewsController)
}
```
@@ -1,10 +0,0 @@
```typescript
export class HttpException extends Error {
  public status: number;
  public message: string;

  constructor(status: number, message: string) {
    super(message);
    this.status = status;
    this.message = message;
  }
}
```
@@ -1,191 +0,0 @@
```typescript
interface MemoOptions<F extends Fn, S extends unknown[] = unknown[]> {
  serialize?: (...args: Parameters<F>) => S;
}
interface MemoAsyncOptions<F extends Fn> extends MemoOptions<F> {
  external?: {
    get: (args: Parameters<F>) => Promise<Awaited<ReturnType<F>> | undefined | null>;

    set: (args: Parameters<F>, value: Awaited<ReturnType<F>>) => Promise<void>;

    remove: (args: Parameters<F>) => Promise<void>;

    clear: () => Promise<void>;
  };
}

type Fn = (...params: any[]) => any;

type AsyncFn = (...params: any[]) => Promise<any>;

interface MemoFunc<F extends Fn> {
  (...args: Parameters<F>): ReturnType<F>;

  get(...args: Parameters<F>): ReturnType<F>;

  raw(...args: Parameters<F>): ReturnType<F>;

  clear(...args: Parameters<F> | []): void | Promise<void>;
}

export const enum State {
  Empty,
  Ok,
  Waiting,
  Error
}

export interface Node<T extends Fn> {
  state: State;
  value: ReturnType<T> | undefined;
  error: unknown;
  primitive: Map<any, Node<T>>;
  reference: WeakMap<any, Node<T>>;
  callbacks?: Set<{ res: (value: ReturnType<T>) => void; rej: (error: unknown) => void }>;
}

function makeNode<T extends Fn>(): Node<T> {
  return {
    state: State.Empty,
    value: undefined,
    error: undefined,
    primitive: new Map(),
    reference: new WeakMap()
  };
}

function clearNode<T extends Fn>(node: Node<T> | undefined) {
  if (node) {
    node.state = State.Empty;
    node.value = undefined;
    node.error = undefined;
    node.primitive = new Map();
    node.reference = new WeakMap();
  }
}
function isPrimitiveType(value: unknown) {
  return (typeof value !== 'object' && typeof value !== 'function') || value === null;
}
function walkBase<T extends Fn, P extends any[] = Parameters<T>>(
  node: Node<T>,
  args: P,
  hooks: { makeNode: () => Node<T> | undefined }
): Node<T> | undefined {
  let cur = node;
  for (const arg of args) {
    if (isPrimitiveType(arg)) {
      if (cur.primitive.has(arg)) {
        cur = cur.primitive.get(arg)!;
      } else {
        const newNode = hooks.makeNode();
        if (newNode) {
          cur.primitive.set(arg, newNode);
          cur = newNode;
        } else {
          return undefined;
        }
      }
    } else {
      if (cur.reference.has(arg)) {
        cur = cur.reference.get(arg)!;
      } else {
        const newNode = hooks.makeNode();
        if (newNode) {
          cur.reference.set(arg, newNode);
          cur = newNode;
        } else {
          return undefined;
        }
      }
    }
  }
  return cur;
}

function walkAndCreate<T extends Fn, P extends any[] = Parameters<T>>(
  node: Node<T>,
  args: P
) {
  return walkBase(node, args, { makeNode })!;
}

function walkOrBreak<T extends Fn, P extends any[] = Parameters<T>>(node: Node<T>, args: P) {
  return walkBase(node, args, { makeNode: () => undefined });
}
export function memoAsync<F extends AsyncFn>(
  fn: F,
  options: MemoAsyncOptions<F> = {}
): MemoFunc<F> {
  const root = makeNode<F>();

  const memoFunc = async function (...args: Parameters<F>) {
    const path = options.serialize ? options.serialize(...args) : args;
    const cur = walkAndCreate<F, any[]>(root, path);

    if (cur.state === State.Ok) {
      return cur.value;
    } else if (cur.state === State.Error) {
      throw cur.error;
    } else if (cur.state === State.Waiting) {
      return new Promise((res, rej) => {
        if (!cur.callbacks) {
          cur.callbacks = new Set();
        }
        cur.callbacks!.add({ res, rej });
      });
    } else {
      try {
        cur.state = State.Waiting;

        const external = options.external ? await options.external.get(args) : undefined;
        const value = external !== undefined && external !== null ? external : await fn(...args);

        cur.state = State.Ok;
        cur.value = value;

        if (options.external) {
          await options.external.set(args, value);
        }

        for (const callback of cur.callbacks ?? []) {
          callback.res(value);
        }

        return value;
      } catch (error) {
        cur.state = State.Error;
        cur.error = error;

        for (const callback of cur.callbacks ?? []) {
          callback.rej(error);
        }

        throw error;
      }
    }
  } as MemoFunc<F>;

  memoFunc.get = (...args) => {
    return memoFunc(...args);
  };

  memoFunc.raw = (...args) => {
    return fn(...args) as ReturnType<F>;
  };

  memoFunc.clear = async (...args) => {
    if (args.length === 0) {
      clearNode(root);
      if (options.external) {
        await options.external.clear();
      }
    } else {
      const cur = walkOrBreak<F>(root, args as Parameters<F>);
      clearNode(cur);
      if (options.external) {
        await options.external.remove(args as Parameters<F>);
      }
    }
  };

  return memoFunc;
}
```
@@ -1,6 +0,0 @@
```typescript
import { Router } from "express";

export interface Routes {
  path?: string;
  router: Router;
}
```
@@ -1,25 +0,0 @@
```typescript
import { HttpException } from '../exceptions/HttpException';
import { plainToClass } from 'class-transformer';
import { validate, ValidationError } from 'class-validator';
import { RequestHandler } from 'express';

const validationMiddleware = (
  type: any,
  value = 'body',
  skipMissingProperties = false,
  whitelist = true,
  forbidNonWhitelisted = true,
): RequestHandler => {
  return (req, res, next) => {
    validate(plainToClass(type, req[value]), { skipMissingProperties, whitelist, forbidNonWhitelisted }).then((errors: ValidationError[]) => {
      if (errors.length > 0) {
        const message = errors.map((error: ValidationError) => Object.values(error.constraints)).join(', ');
        next(new HttpException(400, message));
      } else {
        next();
      }
    });
  };
};

export default validationMiddleware;
```
@@ -1,76 +0,0 @@
```typescript
import { reflection } from 'first-di';
import "reflect-metadata";
import { promises as fs } from 'fs';
import { async } from 'node-stream-zip';
import * as cp from 'child_process';

import path from 'path';

async function exec(cmd: string, opts: (cp.ExecOptions & { trim?: boolean }) = {}): Promise<string> {
  return new Promise((c, e) => {
    cp.exec(cmd, { env: process.env, ...opts }, (err, stdout) => err ? e(err) : c(opts.trim ? stdout.trim() : stdout));
  });
}

@reflection
export class ComputeRepository {
  public computedAdjaxedMatrix = async (outPath: string, cadEntity: string, entityId: string) => {
    const envPath = '/home/idontsudo/t/framework/asp-review-app/server/computed/geometric_feasibility_predicate/env.json'
    const computedScript = '/home/idontsudo/t/framework/asp-review-app/server/computed/geometric_feasibility_predicate/main.py'
    const computedComand = 'freecadcmd'

    const env = JSON.parse((await fs.readFile(envPath)).toString())
    env['cadFilePath'] = cadEntity
    env['outPath'] = outPath
    await fs.writeFile(envPath, JSON.stringify(env))
    // console.log(this._computedPath(computedScript))
    exec(computedComand + ' ' + computedScript, { cwd: this._computedPath(computedScript) }).then((data) => {
      console.log(data)
    })
    this.cadGeneration(cadEntity, entityId, outPath)
    // if (stderr) {
    //   console.log(stderr)
    // }
    // console.log(stdout)
  };
  public computedWriteStability = async (assemblyFolder: string, buildNumber: string, id: string) => {
    const computedScript = '/home/idontsudo/t/framework/cad_stability_input/main.py'
    const computedComand = 'freecad'
    const envPath = '/home/idontsudo/t/framework/cad_stability_input/env.json'
    const env = JSON.parse((await fs.readFile(envPath)).toString())
    env.assemblyFolder = assemblyFolder
    env['projectId'] = id
    env['buildNumber'] = buildNumber
    env['assemblyFolder'] = assemblyFolder
    env['resultURL'] = 'http://localhost:3002/assembly/stabilty/create/?id=' + id + '&' + 'buildNumber=' + buildNumber

    await fs.writeFile(envPath, JSON.stringify(env))
    await exec(computedComand + ' ' + computedScript, { cwd: this._computedPath(computedScript) })
  }

  private _computedPath(f: string) {
    const file = path.basename(f);
    const absolutPath = path.resolve(f)
    return absolutPath.replace(file, '')
  }

  public cadGeneration = async (cadEntity, entity: string, outPath: string,) => {
    const computedScript = '/home/idontsudo/t/framework/cad_generation/main.py'
    const computedComand = 'freecad'
    const envPath = '/home/idontsudo/t/framework/cad_generation/env.json'

    const env = JSON.parse((await fs.readFile(envPath)).toString())
    env.doc = cadEntity
    env.projectId = entity
    env.resultURL = "http://localhost:3002/assembly/save/out"

    await fs.writeFile(envPath, JSON.stringify(env))
    // /stabilty/create

    exec(computedComand + ' ' + computedScript, { cwd: this._computedPath(computedScript) }).then((data) => {
      console.log(data)
    })
  }
}
```
@@ -1,87 +0,0 @@
```typescript
import { promises as fs } from 'fs';
import { dirname } from '../../app';
import fsSync from "fs";
import { autowired, reflection } from 'first-di';
import "reflect-metadata";
import { ComputeRepository } from './compute_repository';
import { ZipRepository } from './zip_repository';

@reflection
export class EntityRepository {

  @autowired()
  private readonly computedRepository: ComputeRepository;
  @autowired()
  private readonly zipRepository: ZipRepository;

  private path: String = dirname + '/public/'

  private getFileName(file: String) {
    return file.slice(0, file.indexOf('.'))
  }

  public async getDir(path) {
    return this._fullPath(await fs.readdir(path + ''), duplicatedDelete(this.path, path))
  }

  public isExistDirPath(path: String): boolean {
    return fsSync.existsSync(path + '')
  }

  public async saveRootEntity(buffer: Buffer, name: string) {
    const filePath = this.path + this.getFileName(name) + '/'

    if (this.isExistDirPath(filePath)) {
      await fs.rm(filePath, { recursive: true })
    }
    await fs.mkdir(filePath);
    await fs.writeFile(filePath + name, buffer);
    this.computedRepository.computedAdjaxedMatrix(filePath, filePath + name, this.getFileName(name))
  }

  public async getAllRootEntity() {
    return await fs.readdir('' + this.path)
  }

  public async getEntityStorage(entity: string): Promise<String[]> | undefined {
    return this._fullPath(await fs.readdir(this.path + entity), entity + '/')
  }

  private _fullPath(folderPath, helpElement = '') {
    return folderPath.map((el) => this.path + helpElement + el)
  }
  public async readJson<T>(path) {
    return JSON.parse((await fs.readFile(path)).toString())
  }
  public async saveGeration(data: Buffer, id: String) {
    const rootFolderPath = '' + this.path + id + '/'
    console.log(rootFolderPath)
    this.zipRepository.archive(rootFolderPath, data)
  }
  public computedStability(id: string, buildNumber: string) {
    const assemblyFolder = this.path + id + '/generation/'
    this.computedRepository.computedWriteStability(assemblyFolder, buildNumber, id)
  }
  public async saveStability(zip: Buffer, id: string, buildNumber: string) {
    const filePath = await this.zipRepository.archive(this.path as string, zip)
    // const buildNumber = data['buildNumber']
    const assemblyFolder = this.path + id + '/generation/stability/'

    if (!this.isExistDirPath(assemblyFolder)) {
      await fs.mkdir(assemblyFolder);
    }
    await this.zipRepository.archive(assemblyFolder as string, zip, buildNumber)
    fs.rmdir(filePath + '/', { recursive: true })
  }
}
function duplicatedDelete(strChild: String, strMain: String) {
  let result = ''
  for (let i = 0; i < strMain.length; i++) {
    if (!(strMain[i] === strChild[i])) {
      result += strMain[i]
    }
  }
  return result
}
```
@ -1,13 +0,0 @@
|
|||
import StreamZip from 'node-stream-zip';
|
||||
import { promises as fs } from 'fs';
|
||||
import decompress from 'decompress'
|
||||
|
||||
export class ZipRepository {
|
||||
public async archive(outhPath: string, zipFile: Buffer, name='generation') {
|
||||
const entry = outhPath + 'archive.zip'
|
||||
await fs.writeFile(entry, zipFile)
|
||||
await decompress(entry, outhPath + name);
|
||||
fs.rm(entry)
|
||||
return outhPath + name
|
||||
}
|
||||
}
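A minimal usage sketch of `ZipRepository.archive` (the directory and file names are hypothetical): it writes the uploaded buffer to a temporary `archive.zip`, unpacks it next to it with the `decompress` package, removes the temp file, and returns the extraction path.

```
import { promises as fs } from 'fs';
import { ZipRepository } from './zip_repository'; // path as in the code above

// Hypothetical caller: unpack an uploaded archive into ./public/cubes/generation.
async function unpackUpload(uploaded: Buffer) {
  const repo = new ZipRepository();
  // outhPath must end with '/' because archive() concatenates paths directly.
  const extractedPath = await repo.archive('./public/cubes/', uploaded);
  console.log(extractedPath);                 // ./public/cubes/generation
  console.log(await fs.readdir(extractedPath)); // unpacked entries
}
```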
@@ -1,5 +0,0 @@
import { AssemblyRoute } from "../../features/assembly_create/assembly_create_route";
import { AssemblyPreviewsRoute } from "../../features/assembly_previews/assembly_previews_route";

export const routes = [new AssemblyRoute(), new AssemblyPreviewsRoute()];
@@ -1,101 +0,0 @@
import { NextFunction, Request, Response } from "express";
import { autowired } from "first-di";
import { EntityRepository } from "../../core/repository/entity_repository";
import { IFile } from "./model/zip_files_model";

export class AssemblyController {
  @autowired()
  private readonly entityRepository: EntityRepository;

  public createRootEntity = (
    req: Request,
    res: Response,
    next: NextFunction
  ) => {
    const file = req.files;
    const cadFile = file["freecad"] as IFile;

    this.entityRepository.saveRootEntity(cadFile.data, cadFile.name);

    res.status(200).json("ok");
    return;
  };

  public getAllAssembly = (
    req: Request,
    res: Response,
    next: NextFunction
  ): void => { };

  public createAssembly = (
    req: Request,
    res: Response,
    next: NextFunction
  ): void => {
    try {
      const file = req.files.freecad as IFile;
      this.entityRepository.saveRootEntity(file.data, file.name);
      res.sendStatus(200);
    } catch (error) {
      next(error);
    }
  };

  public test = (
    req: Request,
    res: Response,
    next: NextFunction
  ) => {
    try {
      const file = req.files;

      const generation = file["zip"] as IFile;
      const id = 'cubes';

      this.entityRepository.saveGeneration(generation.data, id)
      res.sendStatus(200);
    } catch (error) {
      next(error);
    }
  }

  public stabilityComputed = async (
    req: Request,
    res: Response,
    next: NextFunction
  ) => {
    try {
      const id = req.body.id;
      const buildNumber = req.body.buildNumber;

      await this.entityRepository.computedStability(id, buildNumber)
      res.sendStatus(200);
    } catch (error) {
      next(error);
    }
  }

  public stabilityCreate = (
    req: Request,
    res: Response,
    next: NextFunction
  ) => {
    try {
      const files = req.files;
      const zip = files['zip'] as IFile
      const query = req.query as any
      this.entityRepository.saveStability(zip.data, query.id, query.buildNumber)
      res.sendStatus(200);
    } catch (error) {
      next(error);
    }
  }
}
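A hedged client-side sketch of calling `stabilityComputed` through the route registered below (`POST /assembly/stability/write/computed`). The port 3002 comes from `server.ts` further down; the host, the `'cubes'` id, and the build number are examples only, and the handler reading `req.body` presumes a JSON body parser is mounted in the (not shown) `App` class.

```
// Requires Node 18+ for the global fetch.
async function requestStability(id: string, buildNumber: string) {
  const res = await fetch('http://localhost:3002/assembly/stability/write/computed', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // stabilityComputed reads both fields from req.body.
    body: JSON.stringify({ id, buildNumber }),
  });
  console.log(res.status); // 200 on success
}

requestStability('cubes', '00001');
```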
@@ -1,46 +0,0 @@
import express, { Router } from "express";
import { Routes } from "../../core/interfaces/router";
import { autowired } from "first-di";
import { AssemblyController } from "./assembly_create_controller";
import validationMiddleware from "../../core/middlewares/ValidationMiddleware";
import { CadFilesModel } from "./model/zip_files_model";

export class AssemblyRoute implements Routes {
  public path = "/assembly";
  public router = Router();

  @autowired()
  private readonly assemblyController: AssemblyController;

  constructor() {
    this.initializeRoutes();
  }

  private initializeRoutes() {
    this.router.post(
      `${this.path}`,
      validationMiddleware(CadFilesModel, "files"),
      this.assemblyController.createAssembly
    );
    this.router.post(
      `${this.path}/save/out`,
      // validationMiddleware(CadFilesModel, "files"),
      this.assemblyController.test
    );

    this.router.get(`${this.path}`, this.assemblyController.getAllAssembly);

    this.router.post(
      `${this.path}/create`,
      this.assemblyController.createRootEntity
    );
    this.router.post(
      `${this.path}/stability/write/computed`,
      this.assemblyController.stabilityComputed
    );
    this.router.post(
      `${this.path}/stabilty/create/`,
      this.assemblyController.stabilityCreate
    );
  }
}
@@ -1,23 +0,0 @@
import { IsArray, IsObject } from "class-validator";

export interface IFile {
  name: string,
  data: Buffer,
  size: Number,
  encoding: string,
  tempFilePath: string,
  truncated: Boolean,
  mimetype: string,
  md5: string,
}

interface ICadFileModel {
  freecad: IFile;
}

export class CadFilesModel implements ICadFileModel {
  @IsObject()
  public freecad: IFile;
}
@@ -1,156 +0,0 @@
import { NextFunction, Request, Response } from "express";
import { autowired } from "first-di";
import { EntityRepository } from "../../core/repository/entity_repository";
import { port } from "../../server";
import { memoAsync } from "../../core/helper/memorization";

export class AssemblyPreviewsController {
  @autowired()
  private readonly entityRepository: EntityRepository;

  public getAllAssembly = async (
    req: Request,
    res: Response,
    next: NextFunction
  ): Promise<void> => {
    try {
      res.send(await this.entityRepository.getAllRootEntity());
    } catch (error) {
      next(error);
    }
  };

  public getAssemblySubsequenceById = async (
    req: Request,
    res: Response,
    next: NextFunction
  ): Promise<void> => {
    try {
      const entity = await this.entityRepository.getEntityStorage(
        req.params.id
      );

      const aspUsage = Number(req.query.count) - 1;

      if (entity === undefined) {
        res.status(404).json("entity not found");
        return;
      }

      res.json(
        await this._assemblyCompute(
          aspUsage,
          entity,
          this.entityRepository,
          req.hostname,
          req.params.id
        )
      );
    } catch (error) {
      next(error);
    }
  };

  public getAssemblyInsertionSequenceById = async (
    req: Request,
    res: Response,
    next: NextFunction
  ) => {
    const entity = await this.entityRepository.getEntityStorage(req.params.id);

    const aspUsage = Number(req.query.count);

    const assemblyFolder = entity.find((el) => {
      return el.match("assembly");
    });

    const asmCountFolder = "0000" + aspUsage;

    const assemblyDirPath = assemblyFolder + "/" + asmCountFolder;

    if (!this.entityRepository.isExistDirPath(assemblyDirPath)) {
      return res.status(400).json({ error: "bad request" });
    }

    const assemblyProcessDir = await this.entityRepository.getDir(
      assemblyDirPath + "/process/"
    );

    const firstObj = assemblyProcessDir.find((el) => {
      return el.match("1.obj");
    });

    const zeroObj = assemblyProcessDir.find((el) => {
      return el.match("0.obj");
    });

    const insertions = await this.entityRepository.readJson(
      assemblyDirPath + "/" + "insertion_path.json"
    );

    if (
      insertions === undefined ||
      zeroObj === undefined ||
      firstObj === undefined
    ) {
      res.status(400).json({ error: "bad" });
      return;
    }

    res.json({
      offset: aspUsage,
      count: 4,
      parent: `http://${req.hostname}:${port}/${
        req.params.id
      }/assembly/${asmCountFolder}/${0}.obj`,

      child: `http://${req.hostname}:${port}/${
        req.params.id
      }/assembly/${asmCountFolder}/${1}.obj`,

      insertions: insertions,
    });
    return;
  };

  private async _assemblyCompute(
    id: number,
    entityFolder: Array<String>,
    repository: EntityRepository,
    host: string,
    entity: string
  ) {
    const assemblySequence = entityFolder.find((el) => {
      return el.match("step-structure.json");
    });

    const assembly: Array<String> = await repository.readJson<Array<String>>(
      assemblySequence
    );

    if (id == 0) {
      return {
        assembly: [
          `http://${host}:${port}/${entity}/sdf/meshes/${assembly[id]}.obj`,
        ],
        offset: 1,
        count: assembly.length,
      };
    } else {
      const assemblyIndexed = assembly
        .map((_item, index) => {
          if (index <= id) {
            return index;
          }
        })
        .filter((el) => el != undefined);
      return {
        assembly: assemblyIndexed.map((el) => {
          return `http://${host}:${port}/${entity}/sdf/meshes/${assembly[el]}.obj`;
        }),
        count: assemblyIndexed.length,
        offset: assembly.length,
      };
    }
  }
}
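For readers of this API, a sketch of the response shapes the two preview endpoints produce. The field names are taken from the `res.json(...)` calls above; the interface names themselves are ours, not part of the original code.

```
// Shapes inferred from the handlers above; names are hypothetical.
interface SubsequenceResponse {
  assembly: string[]; // URLs of .obj meshes up to the requested step
  offset: number;
  count: number;
}

interface InsertionSequenceResponse {
  offset: number;
  count: number;
  parent: string;      // URL of <step>/0.obj
  child: string;       // URL of <step>/1.obj
  insertions: unknown; // parsed contents of insertion_path.json
}
```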
@@ -1,29 +0,0 @@
import express, { Router } from 'express';
import { Routes } from '../../core/interfaces/router';
import { autowired } from 'first-di';
import path from 'path';
import { dirname } from '../../app';
import validationMiddleware from '../../core/middlewares/ValidationMiddleware';
import { AssemblyPreviewsController } from './assembly_previews_controller';

export class AssemblyPreviewsRoute implements Routes {
  public path = '/assembly/preview/';
  public router = Router();
  @autowired()
  private readonly assemblyPreviewsController: AssemblyPreviewsController;
  constructor() {
    this.initializeRoutes();
  }

  private initializeRoutes() {
    this.router.get(`${this.path}`, this.assemblyPreviewsController.getAllAssembly);
    this.router.get(`${this.path}subsequence/:id`, this.assemblyPreviewsController.getAssemblySubsequenceById)
    this.router.get(`${this.path}insertion_sequence/:id`, this.assemblyPreviewsController.getAssemblyInsertionSequenceById)
  }
}
@@ -1,14 +0,0 @@
import { App } from "./app";
import { routes } from "./core/routes/routes";
import "reflect-metadata";

export const port = 3002

const app = new App(routes, port);

function main() {
  app.listen();
}
main();
@@ -1,28 +0,0 @@
{
  "compileOnSave": false,
  "compilerOptions": {
    "target": "es2017",
    "lib": ["es2017", "esnext.asynciterable"],
    "typeRoots": ["node_modules/@types"],
    "allowSyntheticDefaultImports": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "forceConsistentCasingInFileNames": true,
    "moduleResolution": "node",
    "module": "ESNext",
    "pretty": true,
    "sourceMap": true,
    "declaration": true,
    "outDir": "./dist",
    "allowJs": true,
    "noEmit": false,
    "esModuleInterop": true,
    "resolveJsonModule": true
  },
  "ts-node": {
    "esm": true,
    "experimentalSpecifierResolution": "node"
  },
  "include": ["src/**/*.ts", "src/**/*.json", ".env"],
  "exclude": ["node_modules"]
}
@@ -1,46 +0,0 @@
# Getting Started with Create React App

This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).

## Available Scripts

In the project directory, you can run:

### `yarn start`

Runs the app in the development mode.\
Open [http://localhost:3000](http://localhost:3000) to view it in the browser.

The page will reload if you make edits.\
You will also see any lint errors in the console.

### `yarn test`

Launches the test runner in the interactive watch mode.\
See the section about [running tests](https://facebook.github.io/create-react-app/docs/running-tests) for more information.

### `yarn build`

Builds the app for production to the `build` folder.\
It correctly bundles React in production mode and optimizes the build for the best performance.

The build is minified and the filenames include the hashes.\
Your app is ready to be deployed!

See the section about [deployment](https://facebook.github.io/create-react-app/docs/deployment) for more information.

### `yarn eject`

**Note: this is a one-way operation. Once you `eject`, you can’t go back!**

If you aren’t satisfied with the build tool and configuration choices, you can `eject` at any time. This command will remove the single build dependency from your project.

Instead, it will copy all the configuration files and the transitive dependencies (webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except `eject` will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own.

You don’t have to ever use `eject`. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it.

## Learn More

You can learn more in the [Create React App documentation](https://facebook.github.io/create-react-app/docs/getting-started).

To learn React, check out the [React documentation](https://reactjs.org/).
@@ -1,104 +0,0 @@
'use strict';

const fs = require('fs');
const path = require('path');
const paths = require('./paths');

// Make sure that including paths.js after env.js will read .env variables.
delete require.cache[require.resolve('./paths')];

const NODE_ENV = process.env.NODE_ENV;
if (!NODE_ENV) {
  throw new Error(
    'The NODE_ENV environment variable is required but was not specified.'
  );
}

// https://github.com/bkeepers/dotenv#what-other-env-files-can-i-use
const dotenvFiles = [
  `${paths.dotenv}.${NODE_ENV}.local`,
  // Don't include `.env.local` for `test` environment
  // since normally you expect tests to produce the same
  // results for everyone
  NODE_ENV !== 'test' && `${paths.dotenv}.local`,
  `${paths.dotenv}.${NODE_ENV}`,
  paths.dotenv,
].filter(Boolean);

// Load environment variables from .env* files. Suppress warnings using silent
// if this file is missing. dotenv will never modify any environment variables
// that have already been set. Variable expansion is supported in .env files.
// https://github.com/motdotla/dotenv
// https://github.com/motdotla/dotenv-expand
dotenvFiles.forEach(dotenvFile => {
  if (fs.existsSync(dotenvFile)) {
    require('dotenv-expand')(
      require('dotenv').config({
        path: dotenvFile,
      })
    );
  }
});

// We support resolving modules according to `NODE_PATH`.
// This lets you use absolute paths in imports inside large monorepos:
// https://github.com/facebook/create-react-app/issues/253.
// It works similar to `NODE_PATH` in Node itself:
// https://nodejs.org/api/modules.html#modules_loading_from_the_global_folders
// Note that unlike in Node, only *relative* paths from `NODE_PATH` are honored.
// Otherwise, we risk importing Node.js core modules into an app instead of webpack shims.
// https://github.com/facebook/create-react-app/issues/1023#issuecomment-265344421
// We also resolve them to make sure all tools using them work consistently.
const appDirectory = fs.realpathSync(process.cwd());
process.env.NODE_PATH = (process.env.NODE_PATH || '')
  .split(path.delimiter)
  .filter(folder => folder && !path.isAbsolute(folder))
  .map(folder => path.resolve(appDirectory, folder))
  .join(path.delimiter);

// Grab NODE_ENV and REACT_APP_* environment variables and prepare them to be
// injected into the application via DefinePlugin in webpack configuration.
const REACT_APP = /^REACT_APP_/i;

function getClientEnvironment(publicUrl) {
  const raw = Object.keys(process.env)
    .filter(key => REACT_APP.test(key))
    .reduce(
      (env, key) => {
        env[key] = process.env[key];
        return env;
      },
      {
        // Useful for determining whether we’re running in production mode.
        // Most importantly, it switches React into the correct mode.
        NODE_ENV: process.env.NODE_ENV || 'development',
        // Useful for resolving the correct path to static assets in `public`.
        // For example, <img src={process.env.PUBLIC_URL + '/img/logo.png'} />.
        // This should only be used as an escape hatch. Normally you would put
        // images into the `src` and `import` them in code to get their paths.
        PUBLIC_URL: publicUrl,
        // We support configuring the sockjs pathname during development.
        // These settings let a developer run multiple simultaneous projects.
        // They are used as the connection `hostname`, `pathname` and `port`
        // in webpackHotDevClient. They are used as the `sockHost`, `sockPath`
        // and `sockPort` options in webpack-dev-server.
        WDS_SOCKET_HOST: process.env.WDS_SOCKET_HOST,
        WDS_SOCKET_PATH: process.env.WDS_SOCKET_PATH,
        WDS_SOCKET_PORT: process.env.WDS_SOCKET_PORT,
        // Whether or not react-refresh is enabled.
        // It is defined here so it is available in the webpackHotDevClient.
        FAST_REFRESH: process.env.FAST_REFRESH !== 'false',
      }
    );
  // Stringify all values so we can feed into webpack DefinePlugin
  const stringified = {
    'process.env': Object.keys(raw).reduce((env, key) => {
      env[key] = JSON.stringify(raw[key]);
      return env;
    }, {}),
  };

  return { raw, stringified };
}

module.exports = getClientEnvironment;
@@ -1,66 +0,0 @@
'use strict';

const fs = require('fs');
const path = require('path');
const crypto = require('crypto');
const chalk = require('react-dev-utils/chalk');
const paths = require('./paths');

// Ensure the certificate and key provided are valid and if not
// throw an easy to debug error
function validateKeyAndCerts({ cert, key, keyFile, crtFile }) {
  let encrypted;
  try {
    // publicEncrypt will throw an error with an invalid cert
    encrypted = crypto.publicEncrypt(cert, Buffer.from('test'));
  } catch (err) {
    throw new Error(
      `The certificate "${chalk.yellow(crtFile)}" is invalid.\n${err.message}`
    );
  }

  try {
    // privateDecrypt will throw an error with an invalid key
    crypto.privateDecrypt(key, encrypted);
  } catch (err) {
    throw new Error(
      `The certificate key "${chalk.yellow(keyFile)}" is invalid.\n${
        err.message
      }`
    );
  }
}

// Read file and throw an error if it doesn't exist
function readEnvFile(file, type) {
  if (!fs.existsSync(file)) {
    throw new Error(
      `You specified ${chalk.cyan(
        type
      )} in your env, but the file "${chalk.yellow(file)}" can't be found.`
    );
  }
  return fs.readFileSync(file);
}

// Get the https config
// Return cert files if provided in env, otherwise just true or false
function getHttpsConfig() {
  const { SSL_CRT_FILE, SSL_KEY_FILE, HTTPS } = process.env;
  const isHttps = HTTPS === 'true';

  if (isHttps && SSL_CRT_FILE && SSL_KEY_FILE) {
    const crtFile = path.resolve(paths.appPath, SSL_CRT_FILE);
    const keyFile = path.resolve(paths.appPath, SSL_KEY_FILE);
    const config = {
      cert: readEnvFile(crtFile, 'SSL_CRT_FILE'),
      key: readEnvFile(keyFile, 'SSL_KEY_FILE'),
    };

    validateKeyAndCerts({ ...config, keyFile, crtFile });
    return config;
  }
  return isHttps;
}

module.exports = getHttpsConfig;
@@ -1,29 +0,0 @@
'use strict';

const babelJest = require('babel-jest').default;

const hasJsxRuntime = (() => {
  if (process.env.DISABLE_NEW_JSX_TRANSFORM === 'true') {
    return false;
  }

  try {
    require.resolve('react/jsx-runtime');
    return true;
  } catch (e) {
    return false;
  }
})();

module.exports = babelJest.createTransformer({
  presets: [
    [
      require.resolve('babel-preset-react-app'),
      {
        runtime: hasJsxRuntime ? 'automatic' : 'classic',
      },
    ],
  ],
  babelrc: false,
  configFile: false,
});
@@ -1,14 +0,0 @@
'use strict';

// This is a custom Jest transformer turning style imports into empty objects.
// http://facebook.github.io/jest/docs/en/webpack.html

module.exports = {
  process() {
    return 'module.exports = {};';
  },
  getCacheKey() {
    // The output is always the same.
    return 'cssTransform';
  },
};
@@ -1,40 +0,0 @@
'use strict';

const path = require('path');
const camelcase = require('camelcase');

// This is a custom Jest transformer turning file imports into filenames.
// http://facebook.github.io/jest/docs/en/webpack.html

module.exports = {
  process(src, filename) {
    const assetFilename = JSON.stringify(path.basename(filename));

    if (filename.match(/\.svg$/)) {
      // Based on how SVGR generates a component name:
      // https://github.com/smooth-code/svgr/blob/01b194cf967347d43d4cbe6b434404731b87cf27/packages/core/src/state.js#L6
      const pascalCaseFilename = camelcase(path.parse(filename).name, {
        pascalCase: true,
      });
      const componentName = `Svg${pascalCaseFilename}`;
      return `const React = require('react');
      module.exports = {
        __esModule: true,
        default: ${assetFilename},
        ReactComponent: React.forwardRef(function ${componentName}(props, ref) {
          return {
            $$typeof: Symbol.for('react.element'),
            type: 'svg',
            ref: ref,
            key: null,
            props: Object.assign({}, props, {
              children: ${assetFilename}
            })
          };
        }),
      };`;
    }

    return `module.exports = ${assetFilename};`;
  },
};
@@ -1,134 +0,0 @@
'use strict';

const fs = require('fs');
const path = require('path');
const paths = require('./paths');
const chalk = require('react-dev-utils/chalk');
const resolve = require('resolve');

/**
 * Get additional module paths based on the baseUrl of a compilerOptions object.
 *
 * @param {Object} options
 */
function getAdditionalModulePaths(options = {}) {
  const baseUrl = options.baseUrl;

  if (!baseUrl) {
    return '';
  }

  const baseUrlResolved = path.resolve(paths.appPath, baseUrl);

  // We don't need to do anything if `baseUrl` is set to `node_modules`. This is
  // the default behavior.
  if (path.relative(paths.appNodeModules, baseUrlResolved) === '') {
    return null;
  }

  // Allow the user set the `baseUrl` to `appSrc`.
  if (path.relative(paths.appSrc, baseUrlResolved) === '') {
    return [paths.appSrc];
  }

  // If the path is equal to the root directory we ignore it here.
  // We don't want to allow importing from the root directly as source files are
  // not transpiled outside of `src`. We do allow importing them with the
  // absolute path (e.g. `src/Components/Button.js`) but we set that up with
  // an alias.
  if (path.relative(paths.appPath, baseUrlResolved) === '') {
    return null;
  }

  // Otherwise, throw an error.
  throw new Error(
    chalk.red.bold(
      "Your project's `baseUrl` can only be set to `src` or `node_modules`." +
        ' Create React App does not support other values at this time.'
    )
  );
}

/**
 * Get webpack aliases based on the baseUrl of a compilerOptions object.
 *
 * @param {*} options
 */
function getWebpackAliases(options = {}) {
  const baseUrl = options.baseUrl;

  if (!baseUrl) {
    return {};
  }

  const baseUrlResolved = path.resolve(paths.appPath, baseUrl);

  if (path.relative(paths.appPath, baseUrlResolved) === '') {
    return {
      src: paths.appSrc,
    };
  }
}

/**
 * Get jest aliases based on the baseUrl of a compilerOptions object.
 *
 * @param {*} options
 */
function getJestAliases(options = {}) {
  const baseUrl = options.baseUrl;

  if (!baseUrl) {
    return {};
  }

  const baseUrlResolved = path.resolve(paths.appPath, baseUrl);

  if (path.relative(paths.appPath, baseUrlResolved) === '') {
    return {
      '^src/(.*)$': '<rootDir>/src/$1',
    };
  }
}

function getModules() {
  // Check if TypeScript is setup
  const hasTsConfig = fs.existsSync(paths.appTsConfig);
  const hasJsConfig = fs.existsSync(paths.appJsConfig);

  if (hasTsConfig && hasJsConfig) {
    throw new Error(
      'You have both a tsconfig.json and a jsconfig.json. If you are using TypeScript please remove your jsconfig.json file.'
    );
  }

  let config;

  // If there's a tsconfig.json we assume it's a
  // TypeScript project and set up the config
  // based on tsconfig.json
  if (hasTsConfig) {
    const ts = require(resolve.sync('typescript', {
      basedir: paths.appNodeModules,
    }));
    config = ts.readConfigFile(paths.appTsConfig, ts.sys.readFile).config;
    // Otherwise we'll check if there is jsconfig.json
    // for non TS projects.
  } else if (hasJsConfig) {
    config = require(paths.appJsConfig);
  }

  config = config || {};
  const options = config.compilerOptions || {};

  const additionalModulePaths = getAdditionalModulePaths(options);

  return {
    additionalModulePaths: additionalModulePaths,
    webpackAliases: getWebpackAliases(options),
    jestAliases: getJestAliases(options),
    hasTsConfig,
  };
}

module.exports = getModules();
@@ -1,77 +0,0 @@
'use strict';

const path = require('path');
const fs = require('fs');
const getPublicUrlOrPath = require('react-dev-utils/getPublicUrlOrPath');

// Make sure any symlinks in the project folder are resolved:
// https://github.com/facebook/create-react-app/issues/637
const appDirectory = fs.realpathSync(process.cwd());
const resolveApp = relativePath => path.resolve(appDirectory, relativePath);

// We use `PUBLIC_URL` environment variable or "homepage" field to infer
// "public path" at which the app is served.
// webpack needs to know it to put the right <script> hrefs into HTML even in
// single-page apps that may serve index.html for nested URLs like /todos/42.
// We can't use a relative path in HTML because we don't want to load something
// like /todos/42/static/js/bundle.7289d.js. We have to know the root.
const publicUrlOrPath = getPublicUrlOrPath(
  process.env.NODE_ENV === 'development',
  require(resolveApp('package.json')).homepage,
  process.env.PUBLIC_URL
);

const buildPath = process.env.BUILD_PATH || 'build';

const moduleFileExtensions = [
  'web.mjs',
  'mjs',
  'web.js',
  'js',
  'web.ts',
  'ts',
  'web.tsx',
  'tsx',
  'json',
  'web.jsx',
  'jsx',
];

// Resolve file paths in the same order as webpack
const resolveModule = (resolveFn, filePath) => {
  const extension = moduleFileExtensions.find(extension =>
    fs.existsSync(resolveFn(`${filePath}.${extension}`))
  );

  if (extension) {
    return resolveFn(`${filePath}.${extension}`);
  }

  return resolveFn(`${filePath}.js`);
};

// config after eject: we're in ./config/
module.exports = {
  dotenv: resolveApp('.env'),
  appPath: resolveApp('.'),
  appBuild: resolveApp(buildPath),
  appPublic: resolveApp('public'),
  appHtml: resolveApp('public/index.html'),
  appIndexJs: resolveModule(resolveApp, 'src/index'),
  appPackageJson: resolveApp('package.json'),
  appSrc: resolveApp('src'),
  appTsConfig: resolveApp('tsconfig.json'),
  appJsConfig: resolveApp('jsconfig.json'),
  yarnLockFile: resolveApp('yarn.lock'),
  testsSetup: resolveModule(resolveApp, 'src/setupTests'),
  proxySetup: resolveApp('src/setupProxy.js'),
  appNodeModules: resolveApp('node_modules'),
  appWebpackCache: resolveApp('node_modules/.cache'),
  appTsBuildInfoFile: resolveApp('node_modules/.cache/tsconfig.tsbuildinfo'),
  swSrc: resolveModule(resolveApp, 'src/service-worker'),
  publicUrlOrPath,
};

module.exports.moduleFileExtensions = moduleFileExtensions;
@@ -1,755 +0,0 @@
'use strict';

const fs = require('fs');
const path = require('path');
const webpack = require('webpack');
const resolve = require('resolve');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const CaseSensitivePathsPlugin = require('case-sensitive-paths-webpack-plugin');
const InlineChunkHtmlPlugin = require('react-dev-utils/InlineChunkHtmlPlugin');
const TerserPlugin = require('terser-webpack-plugin');
const MiniCssExtractPlugin = require('mini-css-extract-plugin');
const CssMinimizerPlugin = require('css-minimizer-webpack-plugin');
const { WebpackManifestPlugin } = require('webpack-manifest-plugin');
const InterpolateHtmlPlugin = require('react-dev-utils/InterpolateHtmlPlugin');
const WorkboxWebpackPlugin = require('workbox-webpack-plugin');
const ModuleScopePlugin = require('react-dev-utils/ModuleScopePlugin');
const getCSSModuleLocalIdent = require('react-dev-utils/getCSSModuleLocalIdent');
const ESLintPlugin = require('eslint-webpack-plugin');
const paths = require('./paths');
const modules = require('./modules');
const getClientEnvironment = require('./env');
const ModuleNotFoundPlugin = require('react-dev-utils/ModuleNotFoundPlugin');
const ForkTsCheckerWebpackPlugin =
  process.env.TSC_COMPILE_ON_ERROR === 'true'
    ? require('react-dev-utils/ForkTsCheckerWarningWebpackPlugin')
    : require('react-dev-utils/ForkTsCheckerWebpackPlugin');
const ReactRefreshWebpackPlugin = require('@pmmmwh/react-refresh-webpack-plugin');

const createEnvironmentHash = require('./webpack/persistentCache/createEnvironmentHash');

// Source maps are resource heavy and can cause out of memory issue for large source files.
const shouldUseSourceMap = process.env.GENERATE_SOURCEMAP !== 'false';

const reactRefreshRuntimeEntry = require.resolve('react-refresh/runtime');
const reactRefreshWebpackPluginRuntimeEntry = require.resolve(
  '@pmmmwh/react-refresh-webpack-plugin'
);
const babelRuntimeEntry = require.resolve('babel-preset-react-app');
const babelRuntimeEntryHelpers = require.resolve(
  '@babel/runtime/helpers/esm/assertThisInitialized',
  { paths: [babelRuntimeEntry] }
);
const babelRuntimeRegenerator = require.resolve('@babel/runtime/regenerator', {
  paths: [babelRuntimeEntry],
});

// Some apps do not need the benefits of saving a web request, so not inlining the chunk
// makes for a smoother build process.
const shouldInlineRuntimeChunk = process.env.INLINE_RUNTIME_CHUNK !== 'false';

const emitErrorsAsWarnings = process.env.ESLINT_NO_DEV_ERRORS === 'true';
const disableESLintPlugin = process.env.DISABLE_ESLINT_PLUGIN === 'true';

const imageInlineSizeLimit = parseInt(
  process.env.IMAGE_INLINE_SIZE_LIMIT || '10000'
);

// Check if TypeScript is setup
const useTypeScript = fs.existsSync(paths.appTsConfig);

// Check if Tailwind config exists
const useTailwind = fs.existsSync(
  path.join(paths.appPath, 'tailwind.config.js')
);

// Get the path to the uncompiled service worker (if it exists).
const swSrc = paths.swSrc;

// style files regexes
const cssRegex = /\.css$/;
const cssModuleRegex = /\.module\.css$/;
const sassRegex = /\.(scss|sass)$/;
const sassModuleRegex = /\.module\.(scss|sass)$/;

const hasJsxRuntime = (() => {
  if (process.env.DISABLE_NEW_JSX_TRANSFORM === 'true') {
    return false;
  }

  try {
    require.resolve('react/jsx-runtime');
    return true;
  } catch (e) {
    return false;
  }
})();

// This is the production and development configuration.
// It is focused on developer experience, fast rebuilds, and a minimal bundle.
module.exports = function (webpackEnv) {
  const isEnvDevelopment = webpackEnv === 'development';
  const isEnvProduction = webpackEnv === 'production';

  // Variable used for enabling profiling in Production
  // passed into alias object. Uses a flag if passed into the build command
  const isEnvProductionProfile =
    isEnvProduction && process.argv.includes('--profile');

  // We will provide `paths.publicUrlOrPath` to our app
  // as %PUBLIC_URL% in `index.html` and `process.env.PUBLIC_URL` in JavaScript.
  // Omit trailing slash as %PUBLIC_URL%/xyz looks better than %PUBLIC_URL%xyz.
  // Get environment variables to inject into our app.
  const env = getClientEnvironment(paths.publicUrlOrPath.slice(0, -1));

  const shouldUseReactRefresh = env.raw.FAST_REFRESH;

  // common function to get style loaders
  const getStyleLoaders = (cssOptions, preProcessor) => {
    const loaders = [
      isEnvDevelopment && require.resolve('style-loader'),
      isEnvProduction && {
        loader: MiniCssExtractPlugin.loader,
        // css is located in `static/css`, use '../../' to locate index.html folder
        // in production `paths.publicUrlOrPath` can be a relative path
        options: paths.publicUrlOrPath.startsWith('.')
          ? { publicPath: '../../' }
          : {},
      },
      {
        loader: require.resolve('css-loader'),
        options: cssOptions,
      },
      {
        // Options for PostCSS as we reference these options twice
        // Adds vendor prefixing based on your specified browser support in
        // package.json
        loader: require.resolve('postcss-loader'),
        options: {
          postcssOptions: {
            // Necessary for external CSS imports to work
            // https://github.com/facebook/create-react-app/issues/2677
            ident: 'postcss',
            config: false,
            plugins: !useTailwind
              ? [
                  'postcss-flexbugs-fixes',
                  [
                    'postcss-preset-env',
                    {
                      autoprefixer: {
                        flexbox: 'no-2009',
                      },
                      stage: 3,
                    },
                  ],
                  // Adds PostCSS Normalize as the reset css with default options,
                  // so that it honors browserslist config in package.json
                  // which in turn let's users customize the target behavior as per their needs.
                  'postcss-normalize',
                ]
              : [
                  'tailwindcss',
                  'postcss-flexbugs-fixes',
                  [
                    'postcss-preset-env',
                    {
                      autoprefixer: {
                        flexbox: 'no-2009',
                      },
                      stage: 3,
                    },
                  ],
                ],
          },
          sourceMap: isEnvProduction ? shouldUseSourceMap : isEnvDevelopment,
        },
      },
    ].filter(Boolean);
    if (preProcessor) {
      loaders.push(
        {
          loader: require.resolve('resolve-url-loader'),
          options: {
            sourceMap: isEnvProduction ? shouldUseSourceMap : isEnvDevelopment,
            root: paths.appSrc,
          },
        },
        {
          loader: require.resolve(preProcessor),
          options: {
            sourceMap: true,
          },
        }
      );
    }
    return loaders;
  };

  return {
    target: ['browserslist'],
    // Webpack noise constrained to errors and warnings
    stats: 'errors-warnings',
    mode: isEnvProduction ? 'production' : isEnvDevelopment && 'development',
    // Stop compilation early in production
    bail: isEnvProduction,
    devtool: isEnvProduction
      ? shouldUseSourceMap
        ? 'source-map'
        : false
      : isEnvDevelopment && 'cheap-module-source-map',
    // These are the "entry points" to our application.
    // This means they will be the "root" imports that are included in JS bundle.
    entry: paths.appIndexJs,
    output: {
      // The build folder.
      path: paths.appBuild,
      // Add /* filename */ comments to generated require()s in the output.
      pathinfo: isEnvDevelopment,
      // There will be one main bundle, and one file per asynchronous chunk.
      // In development, it does not produce real files.
      filename: isEnvProduction
        ? 'static/js/[name].[contenthash:8].js'
        : isEnvDevelopment && 'static/js/bundle.js',
      // There are also additional JS chunk files if you use code splitting.
      chunkFilename: isEnvProduction
        ? 'static/js/[name].[contenthash:8].chunk.js'
        : isEnvDevelopment && 'static/js/[name].chunk.js',
      assetModuleFilename: 'static/media/[name].[hash][ext]',
      // webpack uses `publicPath` to determine where the app is being served from.
      // It requires a trailing slash, or the file assets will get an incorrect path.
      // We inferred the "public path" (such as / or /my-project) from homepage.
      publicPath: paths.publicUrlOrPath,
      // Point sourcemap entries to original disk location (format as URL on Windows)
      devtoolModuleFilenameTemplate: isEnvProduction
        ? info =>
            path
              .relative(paths.appSrc, info.absoluteResourcePath)
              .replace(/\\/g, '/')
        : isEnvDevelopment &&
          (info => path.resolve(info.absoluteResourcePath).replace(/\\/g, '/')),
    },
    cache: {
      type: 'filesystem',
      version: createEnvironmentHash(env.raw),
      cacheDirectory: paths.appWebpackCache,
      store: 'pack',
      buildDependencies: {
        defaultWebpack: ['webpack/lib/'],
        config: [__filename],
        tsconfig: [paths.appTsConfig, paths.appJsConfig].filter(f =>
          fs.existsSync(f)
        ),
      },
    },
    infrastructureLogging: {
      level: 'none',
    },
    optimization: {
      minimize: isEnvProduction,
      minimizer: [
        // This is only used in production mode
        new TerserPlugin({
          terserOptions: {
            parse: {
              // We want terser to parse ecma 8 code. However, we don't want it
              // to apply any minification steps that turns valid ecma 5 code
              // into invalid ecma 5 code. This is why the 'compress' and 'output'
              // sections only apply transformations that are ecma 5 safe
              // https://github.com/facebook/create-react-app/pull/4234
              ecma: 8,
            },
            compress: {
              ecma: 5,
              warnings: false,
              // Disabled because of an issue with Uglify breaking seemingly valid code:
              // https://github.com/facebook/create-react-app/issues/2376
              // Pending further investigation:
              // https://github.com/mishoo/UglifyJS2/issues/2011
              comparisons: false,
              // Disabled because of an issue with Terser breaking valid code:
              // https://github.com/facebook/create-react-app/issues/5250
              // Pending further investigation:
              // https://github.com/terser-js/terser/issues/120
              inline: 2,
            },
            mangle: {
              safari10: true,
            },
            // Added for profiling in devtools
            keep_classnames: isEnvProductionProfile,
            keep_fnames: isEnvProductionProfile,
            output: {
              ecma: 5,
              comments: false,
              // Turned on because emoji and regex is not minified properly using default
              // https://github.com/facebook/create-react-app/issues/2488
              ascii_only: true,
            },
          },
        }),
        // This is only used in production mode
        new CssMinimizerPlugin(),
      ],
    },
    resolve: {
      // This allows you to set a fallback for where webpack should look for modules.
      // We placed these paths second because we want `node_modules` to "win"
      // if there are any conflicts. This matches Node resolution mechanism.
      // https://github.com/facebook/create-react-app/issues/253
      modules: ['node_modules', paths.appNodeModules].concat(
        modules.additionalModulePaths || []
      ),
      // These are the reasonable defaults supported by the Node ecosystem.
      // We also include JSX as a common component filename extension to support
      // some tools, although we do not recommend using it, see:
      // https://github.com/facebook/create-react-app/issues/290
      // `web` extension prefixes have been added for better support
      // for React Native Web.
      extensions: paths.moduleFileExtensions
        .map(ext => `.${ext}`)
        .filter(ext => useTypeScript || !ext.includes('ts')),
      alias: {
        // Support React Native Web
        // https://www.smashingmagazine.com/2016/08/a-glimpse-into-the-future-with-react-native-for-web/
        'react-native': 'react-native-web',
        // Allows for better profiling with ReactDevTools
        ...(isEnvProductionProfile && {
          'react-dom$': 'react-dom/profiling',
          'scheduler/tracing': 'scheduler/tracing-profiling',
        }),
        ...(modules.webpackAliases || {}),
      },
      plugins: [
        // Prevents users from importing files from outside of src/ (or node_modules/).
        // This often causes confusion because we only process files within src/ with babel.
        // To fix this, we prevent you from importing files out of src/ -- if you'd like to,
        // please link the files into your node_modules/ and let module-resolution kick in.
        // Make sure your source files are compiled, as they will not be processed in any way.
        new ModuleScopePlugin(paths.appSrc, [
          paths.appPackageJson,
          reactRefreshRuntimeEntry,
          reactRefreshWebpackPluginRuntimeEntry,
          babelRuntimeEntry,
          babelRuntimeEntryHelpers,
          babelRuntimeRegenerator,
        ]),
      ],
    },
    module: {
      strictExportPresence: true,
      rules: [
        // Handle node_modules packages that contain sourcemaps
        shouldUseSourceMap && {
          enforce: 'pre',
          exclude: /@babel(?:\/|\\{1,2})runtime/,
          test: /\.(js|mjs|jsx|ts|tsx|css)$/,
          loader: require.resolve('source-map-loader'),
        },
        {
          // "oneOf" will traverse all following loaders until one will
          // match the requirements. When no loader matches it will fall
          // back to the "file" loader at the end of the loader list.
          oneOf: [
            // TODO: Merge this config once `image/avif` is in the mime-db
            // https://github.com/jshttp/mime-db
            {
              test: [/\.avif$/],
              type: 'asset',
              mimetype: 'image/avif',
              parser: {
                dataUrlCondition: {
                  maxSize: imageInlineSizeLimit,
                },
              },
            },
            // "url" loader works like "file" loader except that it embeds assets
            // smaller than specified limit in bytes as data URLs to avoid requests.
            // A missing `test` is equivalent to a match.
            {
              test: [/\.bmp$/, /\.gif$/, /\.jpe?g$/, /\.png$/],
              type: 'asset',
              parser: {
                dataUrlCondition: {
                  maxSize: imageInlineSizeLimit,
                },
              },
            },
            {
              test: /\.svg$/,
              use: [
                {
                  loader: require.resolve('@svgr/webpack'),
                  options: {
                    prettier: false,
                    svgo: false,
                    svgoConfig: {
                      plugins: [{ removeViewBox: false }],
                    },
                    titleProp: true,
                    ref: true,
                  },
                },
                {
                  loader: require.resolve('file-loader'),
                  options: {
                    name: 'static/media/[name].[hash].[ext]',
                  },
                },
              ],
              issuer: {
                and: [/\.(ts|tsx|js|jsx|md|mdx)$/],
              },
            },
            // Process application JS with Babel.
            // The preset includes JSX, Flow, TypeScript, and some ESnext features.
            {
              test: /\.(js|mjs|jsx|ts|tsx)$/,
              include: paths.appSrc,
              loader: require.resolve('babel-loader'),
              options: {
                customize: require.resolve(
                  'babel-preset-react-app/webpack-overrides'
                ),
                presets: [
                  [
                    require.resolve('babel-preset-react-app'),
                    {
                      runtime: hasJsxRuntime ? 'automatic' : 'classic',
                    },
                  ],
                ],

                plugins: [
                  isEnvDevelopment &&
                    shouldUseReactRefresh &&
                    require.resolve('react-refresh/babel'),
                ].filter(Boolean),
                // This is a feature of `babel-loader` for webpack (not Babel itself).
                // It enables caching results in ./node_modules/.cache/babel-loader/
                // directory for faster rebuilds.
                cacheDirectory: true,
                // See #6846 for context on why cacheCompression is disabled
                cacheCompression: false,
                compact: isEnvProduction,
              },
            },
            // Process any JS outside of the app with Babel.
            // Unlike the application JS, we only compile the standard ES features.
            {
              test: /\.(js|mjs)$/,
              exclude: /@babel(?:\/|\\{1,2})runtime/,
              loader: require.resolve('babel-loader'),
              options: {
                babelrc: false,
                configFile: false,
                compact: false,
                presets: [
                  [
                    require.resolve('babel-preset-react-app/dependencies'),
                    { helpers: true },
                  ],
                ],
                cacheDirectory: true,
                // See #6846 for context on why cacheCompression is disabled
                cacheCompression: false,

                // Babel sourcemaps are needed for debugging into node_modules
                // code. Without the options below, debuggers like VSCode
                // show incorrect code and set breakpoints on the wrong lines.
                sourceMaps: shouldUseSourceMap,
                inputSourceMap: shouldUseSourceMap,
              },
            },
            // "postcss" loader applies autoprefixer to our CSS.
            // "css" loader resolves paths in CSS and adds assets as dependencies.
            // "style" loader turns CSS into JS modules that inject <style> tags.
            // In production, we use MiniCSSExtractPlugin to extract that CSS
            // to a file, but in development "style" loader enables hot editing
            // of CSS.
            // By default we support CSS Modules with the extension .module.css
            {
              test: cssRegex,
              exclude: cssModuleRegex,
              use: getStyleLoaders({
                importLoaders: 1,
                sourceMap: isEnvProduction
                  ? shouldUseSourceMap
                  : isEnvDevelopment,
                modules: {
                  mode: 'icss',
                },
              }),
              // Don't consider CSS imports dead code even if the
              // containing package claims to have no side effects.
              // Remove this when webpack adds a warning or an error for this.
              // See https://github.com/webpack/webpack/issues/6571
              sideEffects: true,
            },
            // Adds support for CSS Modules (https://github.com/css-modules/css-modules)
            // using the extension .module.css
            {
              test: cssModuleRegex,
              use: getStyleLoaders({
                importLoaders: 1,
                sourceMap: isEnvProduction
                  ? shouldUseSourceMap
                  : isEnvDevelopment,
                modules: {
                  mode: 'local',
                  getLocalIdent: getCSSModuleLocalIdent,
                },
              }),
            },
            // Opt-in support for SASS (using .scss or .sass extensions).
            // By default we support SASS Modules with the
            // extensions .module.scss or .module.sass
            {
              test: sassRegex,
              exclude: sassModuleRegex,
              use: getStyleLoaders(
                {
                  importLoaders: 3,
                  sourceMap: isEnvProduction
                    ? shouldUseSourceMap
                    : isEnvDevelopment,
                  modules: {
                    mode: 'icss',
                  },
                },
                'sass-loader'
              ),
              // Don't consider CSS imports dead code even if the
              // containing package claims to have no side effects.
              // Remove this when webpack adds a warning or an error for this.
              // See https://github.com/webpack/webpack/issues/6571
              sideEffects: true,
            },
            // Adds support for CSS Modules, but using SASS
            // using the extension .module.scss or .module.sass
            {
              test: sassModuleRegex,
              use: getStyleLoaders(
                {
                  importLoaders: 3,
                  sourceMap: isEnvProduction
                    ? shouldUseSourceMap
                    : isEnvDevelopment,
                  modules: {
                    mode: 'local',
                    getLocalIdent: getCSSModuleLocalIdent,
                  },
                },
                'sass-loader'
              ),
            },
            // "file" loader makes sure those assets get served by WebpackDevServer.
            // When you `import` an asset, you get its (virtual) filename.
            // In production, they would get copied to the `build` folder.
            // This loader doesn't use a "test" so it will catch all modules
            // that fall through the other loaders.
            {
              // Exclude `js` files to keep "css" loader working as it injects
              // its runtime that would otherwise be processed through "file" loader.
              // Also exclude `html` and `json` extensions so they get processed
              // by webpacks internal loaders.
              exclude: [/^$/, /\.(js|mjs|jsx|ts|tsx)$/, /\.html$/, /\.json$/],
              type: 'asset/resource',
            },
            // ** STOP ** Are you adding a new loader?
            // Make sure to add the new loader(s) before the "file" loader.
          ],
        },
      ].filter(Boolean),
    },
    plugins: [
      // Generates an `index.html` file with the <script> injected.
      new HtmlWebpackPlugin(
        Object.assign(
          {},
          {
            inject: true,
            template: paths.appHtml,
          },
          isEnvProduction
            ? {
                minify: {
                  removeComments: true,
                  collapseWhitespace: true,
                  removeRedundantAttributes: true,
                  useShortDoctype: true,
                  removeEmptyAttributes: true,
                  removeStyleLinkTypeAttributes: true,
                  keepClosingSlash: true,
                  minifyJS: true,
                  minifyCSS: true,
                  minifyURLs: true,
                },
              }
            : undefined
        )
      ),
      // Inlines the webpack runtime script. This script is too small to warrant
      // a network request.
      // https://github.com/facebook/create-react-app/issues/5358
      isEnvProduction &&
        shouldInlineRuntimeChunk &&
        new InlineChunkHtmlPlugin(HtmlWebpackPlugin, [/runtime-.+[.]js/]),
      // Makes some environment variables available in index.html.
      // The public URL is available as %PUBLIC_URL% in index.html, e.g.:
      // <link rel="icon" href="%PUBLIC_URL%/favicon.ico">
      // It will be an empty string unless you specify "homepage"
      // in `package.json`, in which case it will be the pathname of that URL.
      new InterpolateHtmlPlugin(HtmlWebpackPlugin, env.raw),
      // This gives some necessary context to module not found errors, such as
      // the requesting resource.
      new ModuleNotFoundPlugin(paths.appPath),
      // Makes some environment variables available to the JS code, for example:
      // if (process.env.NODE_ENV === 'production') { ... }. See `./env.js`.
      // It is absolutely essential that NODE_ENV is set to production
      // during a production build.
      // Otherwise React will be compiled in the very slow development mode.
      new webpack.DefinePlugin(env.stringified),
      // Experimental hot reloading for React .
      // https://github.com/facebook/react/tree/main/packages/react-refresh
      isEnvDevelopment &&
        shouldUseReactRefresh &&
        new ReactRefreshWebpackPlugin({
          overlay: false,
        }),
      // Watcher doesn't work well if you mistype casing in a path so we use
      // a plugin that prints an error when you attempt to do this.
      // See https://github.com/facebook/create-react-app/issues/240
      isEnvDevelopment && new CaseSensitivePathsPlugin(),
      isEnvProduction &&
        new MiniCssExtractPlugin({
          // Options similar to the same options in webpackOptions.output
          // both options are optional
          filename: 'static/css/[name].[contenthash:8].css',
          chunkFilename: 'static/css/[name].[contenthash:8].chunk.css',
        }),
      // Generate an asset manifest file with the following content:
      // - "files" key: Mapping of all asset filenames to their corresponding
      //   output file so that tools can pick it up without having to parse
      //   `index.html`
      // - "entrypoints" key: Array of files which are included in `index.html`,
      //   can be used to reconstruct the HTML if necessary
      new WebpackManifestPlugin({
        fileName: 'asset-manifest.json',
        publicPath: paths.publicUrlOrPath,
        generate: (seed, files, entrypoints) => {
          const manifestFiles = files.reduce((manifest, file) => {
            manifest[file.name] = file.path;
            return manifest;
          }, seed);
          const entrypointFiles = entrypoints.main.filter(
            fileName => !fileName.endsWith('.map')
          );

          return {
            files: manifestFiles,
            entrypoints: entrypointFiles,
          };
        },
      }),
      // Moment.js is an extremely popular library that bundles large locale files
      // by default due to how webpack interprets its code. This is a practical
      // solution that requires the user to opt into importing specific locales.
      // https://github.com/jmblog/how-to-optimize-momentjs-with-webpack
      // You can remove this if you don't use Moment.js:
      new webpack.IgnorePlugin({
        resourceRegExp: /^\.\/locale$/,
        contextRegExp: /moment$/,
      }),
      // Generate a service worker script that will precache, and keep up to date,
      // the HTML & assets that are part of the webpack build.
      isEnvProduction &&
        fs.existsSync(swSrc) &&
        new WorkboxWebpackPlugin.InjectManifest({
          swSrc,
          dontCacheBustURLsMatching: /\.[0-9a-f]{8}\./,
          exclude: [/\.map$/, /asset-manifest\.json$/, /LICENSE/],
          // Bump up the default maximum size (2mb) that's precached,
          // to make lazy-loading failure scenarios less likely.
          // See https://github.com/cra-template/pwa/issues/13#issuecomment-722667270
          maximumFileSizeToCacheInBytes: 5 * 1024 * 1024,
        }),
      // TypeScript type checking
      useTypeScript &&
        new ForkTsCheckerWebpackPlugin({
          async: isEnvDevelopment,
          typescript: {
            typescriptPath: resolve.sync('typescript', {
              basedir: paths.appNodeModules,
            }),
            configOverwrite: {
              compilerOptions: {
                sourceMap: isEnvProduction
                  ? shouldUseSourceMap
                  : isEnvDevelopment,
                skipLibCheck: true,
                inlineSourceMap: false,
                declarationMap: false,
                noEmit: true,
                incremental: true,
                tsBuildInfoFile: paths.appTsBuildInfoFile,
              },
            },
            context: paths.appPath,
            diagnosticOptions: {
              syntactic: true,
            },
            mode: 'write-references',
            // profile: true,
          },
          issue: {
            // This one is specifically to match during CI tests,
            // as micromatch doesn't match
            // '../cra-template-typescript/template/src/App.tsx'
            // otherwise.
            include: [
              { file: '../**/src/**/*.{ts,tsx}' },
              { file: '**/src/**/*.{ts,tsx}' },
            ],
            exclude: [
              { file: '**/src/**/__tests__/**' },
              { file: '**/src/**/?(*.){spec|test}.*' },
              { file: '**/src/setupProxy.*' },
              { file: '**/src/setupTests.*' },
            ],
          },
          logger: {
            infrastructure: 'silent',
          },
        }),
      !disableESLintPlugin &&
        new ESLintPlugin({
          // Plugin options
          extensions: ['js', 'mjs', 'jsx', 'ts', 'tsx'],
          formatter: require.resolve('react-dev-utils/eslintFormatter'),
          eslintPath: require.resolve('eslint'),
          failOnError: !(isEnvDevelopment && emitErrorsAsWarnings),
          context: paths.appSrc,
          cache: true,
          cacheLocation: path.resolve(
            paths.appNodeModules,
            '.cache/.eslintcache'
          ),
          // ESLint class options
          cwd: paths.appPath,
          resolvePluginsRelativeTo: __dirname,
          baseConfig: {
            extends: [require.resolve('eslint-config-react-app/base')],
            rules: {
              ...(!hasJsxRuntime && {
                'react/react-in-jsx-scope': 'error',
              }),
            },
          },
        }),
    ].filter(Boolean),
    // Turn off performance processing because we utilize
    // our own hints via the FileSizeReporter
    performance: false,
  };
};
|
|
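Note on the pattern above: entries such as `isEnvDevelopment && new CaseSensitivePathsPlugin()` evaluate to `false` when the guard is off, and the closing `.filter(Boolean)` strips those falsy slots out of the plugins array. A standalone sketch of the idiom (names are illustrative only, not from the config):

// Conditional-array sketch: falsy entries are dropped by .filter(Boolean).
const isDev = process.env.NODE_ENV !== "production";

const plugins = [
  { name: "always-on" },
  isDev && { name: "dev-only" },   // `false` when building for production
  !isDev && { name: "prod-only" }, // `false` during development
].filter(Boolean) as { name: string }[];

console.log(plugins.map(p => p.name));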
@ -1,9 +0,0 @@
'use strict';
const { createHash } = require('crypto');

module.exports = env => {
  const hash = createHash('md5');
  hash.update(JSON.stringify(env));

  return hash.digest('hex');
};
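A quick usage sketch for the env-hash module above (the env values and require path are hypothetical): because the digest is computed from `JSON.stringify(env)`, equal environments always produce the same hex string, which is what makes it suitable as a build-cache version key.

// Usage sketch for the md5 env-hash module above (hypothetical values).
const createEnvironmentHash = require("./createEnvironmentHash");

const a = createEnvironmentHash({ NODE_ENV: "production", PUBLIC_URL: "" });
const b = createEnvironmentHash({ NODE_ENV: "production", PUBLIC_URL: "" });
console.log(a === b); // true: same env, same 32-character hex digest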
@ -1,127 +0,0 @@
'use strict';

const fs = require('fs');
const evalSourceMapMiddleware = require('react-dev-utils/evalSourceMapMiddleware');
const noopServiceWorkerMiddleware = require('react-dev-utils/noopServiceWorkerMiddleware');
const ignoredFiles = require('react-dev-utils/ignoredFiles');
const redirectServedPath = require('react-dev-utils/redirectServedPathMiddleware');
const paths = require('./paths');
const getHttpsConfig = require('./getHttpsConfig');

const host = process.env.HOST || '0.0.0.0';
const sockHost = process.env.WDS_SOCKET_HOST;
const sockPath = process.env.WDS_SOCKET_PATH; // default: '/ws'
const sockPort = process.env.WDS_SOCKET_PORT;

module.exports = function (proxy, allowedHost) {
  const disableFirewall =
    !proxy || process.env.DANGEROUSLY_DISABLE_HOST_CHECK === 'true';
  return {
    // WebpackDevServer 2.4.3 introduced a security fix that prevents remote
    // websites from potentially accessing local content through DNS rebinding:
    // https://github.com/webpack/webpack-dev-server/issues/887
    // https://medium.com/webpack/webpack-dev-server-middleware-security-issues-1489d950874a
    // However, it made several existing use cases such as development in cloud
    // environment or subdomains in development significantly more complicated:
    // https://github.com/facebook/create-react-app/issues/2271
    // https://github.com/facebook/create-react-app/issues/2233
    // While we're investigating better solutions, for now we will take a
    // compromise. Since our WDS configuration only serves files in the `public`
    // folder we won't consider accessing them a vulnerability. However, if you
    // use the `proxy` feature, it gets more dangerous because it can expose
    // remote code execution vulnerabilities in backends like Django and Rails.
    // So we will disable the host check normally, but enable it if you have
    // specified the `proxy` setting. Finally, we let you override it if you
    // really know what you're doing with a special environment variable.
    // Note: ["localhost", ".localhost"] will support subdomains - but we might
    // want to allow setting the allowedHosts manually for more complex setups
    allowedHosts: disableFirewall ? 'all' : [allowedHost],
    headers: {
      'Access-Control-Allow-Origin': '*',
      'Access-Control-Allow-Methods': '*',
      'Access-Control-Allow-Headers': '*',
    },
    // Enable gzip compression of generated files.
    compress: true,
    static: {
      // By default WebpackDevServer serves physical files from current directory
      // in addition to all the virtual build products that it serves from memory.
      // This is confusing because those files won’t automatically be available in
      // production build folder unless we copy them. However, copying the whole
      // project directory is dangerous because we may expose sensitive files.
      // Instead, we establish a convention that only files in `public` directory
      // get served. Our build script will copy `public` into the `build` folder.
      // In `index.html`, you can get URL of `public` folder with %PUBLIC_URL%:
      // <link rel="icon" href="%PUBLIC_URL%/favicon.ico">
      // In JavaScript code, you can access it with `process.env.PUBLIC_URL`.
      // Note that we only recommend to use `public` folder as an escape hatch
      // for files like `favicon.ico`, `manifest.json`, and libraries that are
      // for some reason broken when imported through webpack. If you just want to
      // use an image, put it in `src` and `import` it from JavaScript instead.
      directory: paths.appPublic,
      publicPath: [paths.publicUrlOrPath],
      // By default files from `contentBase` will not trigger a page reload.
      watch: {
        // Reportedly, this avoids CPU overload on some systems.
        // https://github.com/facebook/create-react-app/issues/293
        // src/node_modules is not ignored to support absolute imports
        // https://github.com/facebook/create-react-app/issues/1065
        ignored: ignoredFiles(paths.appSrc),
      },
    },
    client: {
      webSocketURL: {
        // Enable custom sockjs hostname, pathname and port for websocket connection
        // to hot reloading server.
        hostname: sockHost,
        pathname: sockPath,
        port: sockPort,
      },
      overlay: {
        errors: true,
        warnings: false,
      },
    },
    devMiddleware: {
      // It is important to tell WebpackDevServer to use the same "publicPath" path as
      // we specified in the webpack config. When homepage is '.', default to serving
      // from the root.
      // remove last slash so user can land on `/test` instead of `/test/`
      publicPath: paths.publicUrlOrPath.slice(0, -1),
    },

    https: getHttpsConfig(),
    host,
    historyApiFallback: {
      // Paths with dots should still use the history fallback.
      // See https://github.com/facebook/create-react-app/issues/387.
      disableDotRule: true,
      index: paths.publicUrlOrPath,
    },
    // `proxy` is run between `before` and `after` `webpack-dev-server` hooks
    proxy,
    onBeforeSetupMiddleware(devServer) {
      // Keep `evalSourceMapMiddleware`
      // middlewares before `redirectServedPath` otherwise will not have any effect
      // This lets us fetch source contents from webpack for the error overlay
      devServer.app.use(evalSourceMapMiddleware(devServer));

      if (fs.existsSync(paths.proxySetup)) {
        // This registers user provided middleware for proxy reasons
        require(paths.proxySetup)(devServer.app);
      }
    },
    onAfterSetupMiddleware(devServer) {
      // Redirect to `PUBLIC_URL` or `homepage` from `package.json` if url not match
      devServer.app.use(redirectServedPath(paths.publicUrlOrPath));

      // This service worker file is effectively a 'no-op' that will reset any
      // previous service worker registered for the same host:port combination.
      // We do this in development to avoid hitting the production cache if
      // it used the same host and port.
      // https://github.com/facebook/create-react-app/issues/2272#issuecomment-302832432
      devServer.app.use(noopServiceWorkerMiddleware(paths.publicUrlOrPath));
    },
  };
};
asp-review-app/ui/package-lock.json (generated, 17226 lines)
File diff suppressed because it is too large
@ -1,165 +0,0 @@
{
  "name": "i18next",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@babel/core": "^7.16.0",
    "@pmmmwh/react-refresh-webpack-plugin": "^0.5.3",
    "@react-three/drei": "^9.65.3",
    "@svgr/webpack": "^5.5.0",
    "@testing-library/jest-dom": "^5.14.1",
    "@testing-library/react": "^13.0.0",
    "@testing-library/user-event": "^13.2.1",
    "@types/jest": "^27.0.1",
    "@types/node": "^16.7.13",
    "@types/react": "18.0.25",
    "@types/react-dom": "18.0.9",
    "antd": "^5.5.2",
    "babel-jest": "^27.4.2",
    "babel-loader": "^8.2.3",
    "babel-plugin-named-asset-import": "^0.3.8",
    "babel-preset-react-app": "^10.0.1",
    "bfj": "^7.0.2",
    "browserslist": "^4.18.1",
    "camelcase": "^6.2.1",
    "case-sensitive-paths-webpack-plugin": "^2.4.0",
    "css-loader": "^6.5.1",
    "css-minimizer-webpack-plugin": "^3.2.0",
    "dotenv": "^10.0.0",
    "dotenv-expand": "^5.1.0",
    "eslint": "^8.3.0",
    "eslint-config-react-app": "^7.0.1",
    "eslint-webpack-plugin": "^3.1.1",
    "file-loader": "^6.2.0",
    "fs-extra": "^10.0.0",
    "html-webpack-plugin": "^5.5.0",
    "i18next": "^22.4.14",
    "i18next-browser-languagedetector": "^7.0.1",
    "identity-obj-proxy": "^3.0.0",
    "jest": "^27.4.3",
    "jest-resolve": "^27.4.2",
    "jest-watch-typeahead": "^1.0.0",
    "localforage": "^1.10.0",
    "match-sorter": "^6.3.1",
    "mini-css-extract-plugin": "^2.4.5",
    "mobx": "^6.9.0",
    "mobx-react": "^7.6.0",
    "postcss": "^8.4.4",
    "postcss-flexbugs-fixes": "^5.0.2",
    "postcss-loader": "^6.2.1",
    "postcss-normalize": "^10.0.1",
    "postcss-preset-env": "^7.0.1",
    "prompts": "^2.4.2",
    "react": "18.0.0",
    "react-app-polyfill": "^3.0.0",
    "react-dev-utils": "^12.0.1",
    "react-dom": "18.0.0",
    "react-i18next": "^12.2.0",
    "react-refresh": "^0.11.0",
    "react-router-dom": "^6.11.2",
    "react-three-fiber": "^6.0.13",
    "resolve": "^1.20.0",
    "resolve-url-loader": "^4.0.0",
    "rete": "2.0.0-beta.9",
    "rete-area-plugin": "2.0.0-beta.12",
    "rete-connection-plugin": "2.0.0-beta.16",
    "rete-react-render-plugin": "2.0.0-beta.22",
    "rete-render-utils": "2.0.0-beta.12",
    "sass-loader": "^12.3.0",
    "semver": "^7.3.5",
    "sort-by": "^1.2.0",
    "source-map-loader": "^3.0.0",
    "style-loader": "^3.3.1",
    "tailwindcss": "^3.0.2",
    "terser-webpack-plugin": "^5.2.5",
    "three": "^0.151.3",
    "typescript": "^4.4.2",
    "web-vitals": "^2.1.0",
    "webpack": "^5.64.4",
    "webpack-dev-server": "^4.6.0",
    "webpack-manifest-plugin": "^4.0.2",
    "workbox-webpack-plugin": "^6.4.1"
  },
  "scripts": {
    "dev": "node scripts/start.js",
    "build": "node scripts/build.js",
    "test": "node scripts/test.js"
  },
  "eslintConfig": {
    "extends": [
      "react-app",
      "react-app/jest"
    ]
  },
  "browserslist": {
    "production": [
      ">0.2%",
      "not dead",
      "not op_mini all"
    ],
    "development": [
      "last 1 chrome version",
      "last 1 firefox version",
      "last 1 safari version"
    ]
  },
  "devDependencies": {
    "@types/three": "^0.150.1"
  },
  "jest": {
    "roots": [
      "<rootDir>/src"
    ],
    "collectCoverageFrom": [
      "src/**/*.{js,jsx,ts,tsx}",
      "!src/**/*.d.ts"
    ],
    "setupFiles": [
      "react-app-polyfill/jsdom"
    ],
    "setupFilesAfterEnv": [
      "<rootDir>/src/setupTests.ts"
    ],
    "testMatch": [
      "<rootDir>/src/**/__tests__/**/*.{js,jsx,ts,tsx}",
      "<rootDir>/src/**/*.{spec,test}.{js,jsx,ts,tsx}"
    ],
    "testEnvironment": "jsdom",
    "transform": {
      "^.+\\.(js|jsx|mjs|cjs|ts|tsx)$": "<rootDir>/config/jest/babelTransform.js",
      "^.+\\.css$": "<rootDir>/config/jest/cssTransform.js",
      "^(?!.*\\.(js|jsx|mjs|cjs|ts|tsx|css|json)$)": "<rootDir>/config/jest/fileTransform.js"
    },
    "transformIgnorePatterns": [
      "[/\\\\]node_modules[/\\\\].+\\.(js|jsx|mjs|cjs|ts|tsx)$",
      "^.+\\.module\\.(css|sass|scss)$"
    ],
    "modulePaths": [],
    "moduleNameMapper": {
      "^react-native$": "react-native-web",
      "^.+\\.module\\.(css|sass|scss)$": "identity-obj-proxy"
    },
    "moduleFileExtensions": [
      "web.js",
      "js",
      "web.ts",
      "ts",
      "web.tsx",
      "tsx",
      "json",
      "web.jsx",
      "jsx",
      "node"
    ],
    "watchPlugins": [
      "jest-watch-typeahead/filename",
      "jest-watch-typeahead/testname"
    ],
    "resetMocks": true
  },
  "babel": {
    "presets": [
      "react-app"
    ]
  }
}
Binary file removed (image, 3.8 KiB).
@ -1,43 +0,0 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <meta name="theme-color" content="#000000" />
    <meta
      name="description"
      content="Web site created using create-react-app"
    />
    <link rel="apple-touch-icon" href="%PUBLIC_URL%/logo192.png" />
    <!--
      manifest.json provides metadata used when your web app is installed on a
      user's mobile device or desktop. See https://developers.google.com/web/fundamentals/web-app-manifest/
    -->
    <link rel="manifest" href="%PUBLIC_URL%/manifest.json" />
    <!--
      Notice the use of %PUBLIC_URL% in the tags above.
      It will be replaced with the URL of the `public` folder during the build.
      Only files inside the `public` folder can be referenced from the HTML.

      Unlike "/favicon.ico" or "favicon.ico", "%PUBLIC_URL%/favicon.ico" will
      work correctly both with client-side routing and a non-root public URL.
      Learn how to configure a non-root public URL by running `npm run build`.
    -->
    <title>React App</title>
  </head>
  <body>
    <noscript>You need to enable JavaScript to run this app.</noscript>
    <div class="root" id="root"></div>
    <!--
      This HTML file is a template.
      If you open it directly in the browser, you will see an empty page.

      You can add webfonts, meta tags, or analytics to this file.
      The build step will place the bundled scripts into the <body> tag.

      To begin the development, run `npm start` or `yarn start`.
      To create a production bundle, use `npm run build` or `yarn build`.
    -->
  </body>
</html>
Binary file removed (image, 5.2 KiB).
Binary file removed (image, 9.4 KiB).
@ -1,25 +0,0 @@
{
  "short_name": "React App",
  "name": "Create React App Sample",
  "icons": [
    {
      "src": "favicon.ico",
      "sizes": "64x64 32x32 24x24 16x16",
      "type": "image/x-icon"
    },
    {
      "src": "logo192.png",
      "type": "image/png",
      "sizes": "192x192"
    },
    {
      "src": "logo512.png",
      "type": "image/png",
      "sizes": "512x512"
    }
  ],
  "start_url": ".",
  "display": "standalone",
  "theme_color": "#000000",
  "background_color": "#ffffff"
}
@ -1,3 +0,0 @@
# https://www.robotstxt.org/robotstxt.html
User-agent: *
Disallow:
@ -1,217 +0,0 @@
'use strict';

// Do this as the first thing so that any code reading it knows the right env.
process.env.BABEL_ENV = 'production';
process.env.NODE_ENV = 'production';

// Makes the script crash on unhandled rejections instead of silently
// ignoring them. In the future, promise rejections that are not handled will
// terminate the Node.js process with a non-zero exit code.
process.on('unhandledRejection', err => {
  throw err;
});

// Ensure environment variables are read.
require('../config/env');

const path = require('path');
const chalk = require('react-dev-utils/chalk');
const fs = require('fs-extra');
const bfj = require('bfj');
const webpack = require('webpack');
const configFactory = require('../config/webpack.config');
const paths = require('../config/paths');
const checkRequiredFiles = require('react-dev-utils/checkRequiredFiles');
const formatWebpackMessages = require('react-dev-utils/formatWebpackMessages');
const printHostingInstructions = require('react-dev-utils/printHostingInstructions');
const FileSizeReporter = require('react-dev-utils/FileSizeReporter');
const printBuildError = require('react-dev-utils/printBuildError');

const measureFileSizesBeforeBuild =
  FileSizeReporter.measureFileSizesBeforeBuild;
const printFileSizesAfterBuild = FileSizeReporter.printFileSizesAfterBuild;
const useYarn = fs.existsSync(paths.yarnLockFile);

// These sizes are pretty large. We'll warn for bundles exceeding them.
const WARN_AFTER_BUNDLE_GZIP_SIZE = 512 * 1024;
const WARN_AFTER_CHUNK_GZIP_SIZE = 1024 * 1024;

const isInteractive = process.stdout.isTTY;

// Warn and crash if required files are missing
if (!checkRequiredFiles([paths.appHtml, paths.appIndexJs])) {
  process.exit(1);
}

const argv = process.argv.slice(2);
const writeStatsJson = argv.indexOf('--stats') !== -1;

// Generate configuration
const config = configFactory('production');

// We require that you explicitly set browsers and do not fall back to
// browserslist defaults.
const { checkBrowsers } = require('react-dev-utils/browsersHelper');
checkBrowsers(paths.appPath, isInteractive)
  .then(() => {
    // First, read the current file sizes in build directory.
    // This lets us display how much they changed later.
    return measureFileSizesBeforeBuild(paths.appBuild);
  })
  .then(previousFileSizes => {
    // Remove all content but keep the directory so that
    // if you're in it, you don't end up in Trash
    fs.emptyDirSync(paths.appBuild);
    // Merge with the public folder
    copyPublicFolder();
    // Start the webpack build
    return build(previousFileSizes);
  })
  .then(
    ({ stats, previousFileSizes, warnings }) => {
      if (warnings.length) {
        console.log(chalk.yellow('Compiled with warnings.\n'));
        console.log(warnings.join('\n\n'));
        console.log(
          '\nSearch for the ' +
            chalk.underline(chalk.yellow('keywords')) +
            ' to learn more about each warning.'
        );
        console.log(
          'To ignore, add ' +
            chalk.cyan('// eslint-disable-next-line') +
            ' to the line before.\n'
        );
      } else {
        console.log(chalk.green('Compiled successfully.\n'));
      }

      console.log('File sizes after gzip:\n');
      printFileSizesAfterBuild(
        stats,
        previousFileSizes,
        paths.appBuild,
        WARN_AFTER_BUNDLE_GZIP_SIZE,
        WARN_AFTER_CHUNK_GZIP_SIZE
      );
      console.log();

      const appPackage = require(paths.appPackageJson);
      const publicUrl = paths.publicUrlOrPath;
      const publicPath = config.output.publicPath;
      const buildFolder = path.relative(process.cwd(), paths.appBuild);
      printHostingInstructions(
        appPackage,
        publicUrl,
        publicPath,
        buildFolder,
        useYarn
      );
    },
    err => {
      const tscCompileOnError = process.env.TSC_COMPILE_ON_ERROR === 'true';
      if (tscCompileOnError) {
        console.log(
          chalk.yellow(
            'Compiled with the following type errors (you may want to check these before deploying your app):\n'
          )
        );
        printBuildError(err);
      } else {
        console.log(chalk.red('Failed to compile.\n'));
        printBuildError(err);
        process.exit(1);
      }
    }
  )
  .catch(err => {
    if (err && err.message) {
      console.log(err.message);
    }
    process.exit(1);
  });

// Create the production build and print the deployment instructions.
function build(previousFileSizes) {
  console.log('Creating an optimized production build...');

  const compiler = webpack(config);
  return new Promise((resolve, reject) => {
    compiler.run((err, stats) => {
      let messages;
      if (err) {
        if (!err.message) {
          return reject(err);
        }

        let errMessage = err.message;

        // Add additional information for postcss errors
        if (Object.prototype.hasOwnProperty.call(err, 'postcssNode')) {
          errMessage +=
            '\nCompileError: Begins at CSS selector ' +
            err['postcssNode'].selector;
        }

        messages = formatWebpackMessages({
          errors: [errMessage],
          warnings: [],
        });
      } else {
        messages = formatWebpackMessages(
          stats.toJson({ all: false, warnings: true, errors: true })
        );
      }
      if (messages.errors.length) {
        // Only keep the first error. Others are often indicative
        // of the same problem, but confuse the reader with noise.
        if (messages.errors.length > 1) {
          messages.errors.length = 1;
        }
        return reject(new Error(messages.errors.join('\n\n')));
      }
      if (
        process.env.CI &&
        (typeof process.env.CI !== 'string' ||
          process.env.CI.toLowerCase() !== 'false') &&
        messages.warnings.length
      ) {
        // Ignore sourcemap warnings in CI builds. See #8227 for more info.
        const filteredWarnings = messages.warnings.filter(
          w => !/Failed to parse source map/.test(w)
        );
        if (filteredWarnings.length) {
          console.log(
            chalk.yellow(
              '\nTreating warnings as errors because process.env.CI = true.\n' +
                'Most CI servers set it automatically.\n'
            )
          );
          return reject(new Error(filteredWarnings.join('\n\n')));
        }
      }

      const resolveArgs = {
        stats,
        previousFileSizes,
        warnings: messages.warnings,
      };

      if (writeStatsJson) {
        return bfj
          .write(paths.appBuild + '/bundle-stats.json', stats.toJson())
          .then(() => resolve(resolveArgs))
          .catch(error => reject(new Error(error)));
      }

      return resolve(resolveArgs);
    });
  });
}

function copyPublicFolder() {
  fs.copySync(paths.appPublic, paths.appBuild, {
    dereference: true,
    filter: file => file !== paths.appHtml,
  });
}
@ -1,154 +0,0 @@
'use strict';

// Do this as the first thing so that any code reading it knows the right env.
process.env.BABEL_ENV = 'development';
process.env.NODE_ENV = 'development';

// Makes the script crash on unhandled rejections instead of silently
// ignoring them. In the future, promise rejections that are not handled will
// terminate the Node.js process with a non-zero exit code.
process.on('unhandledRejection', err => {
  throw err;
});

// Ensure environment variables are read.
require('../config/env');

const fs = require('fs');
const chalk = require('react-dev-utils/chalk');
const webpack = require('webpack');
const WebpackDevServer = require('webpack-dev-server');
const clearConsole = require('react-dev-utils/clearConsole');
const checkRequiredFiles = require('react-dev-utils/checkRequiredFiles');
const {
  choosePort,
  createCompiler,
  prepareProxy,
  prepareUrls,
} = require('react-dev-utils/WebpackDevServerUtils');
const openBrowser = require('react-dev-utils/openBrowser');
const semver = require('semver');
const paths = require('../config/paths');
const configFactory = require('../config/webpack.config');
const createDevServerConfig = require('../config/webpackDevServer.config');
const getClientEnvironment = require('../config/env');
const react = require(require.resolve('react', { paths: [paths.appPath] }));

const env = getClientEnvironment(paths.publicUrlOrPath.slice(0, -1));
const useYarn = fs.existsSync(paths.yarnLockFile);
const isInteractive = process.stdout.isTTY;

// Warn and crash if required files are missing
if (!checkRequiredFiles([paths.appHtml, paths.appIndexJs])) {
  process.exit(1);
}

// Tools like Cloud9 rely on this.
const DEFAULT_PORT = parseInt(process.env.PORT, 10) || 3000;
const HOST = process.env.HOST || '0.0.0.0';

if (process.env.HOST) {
  console.log(
    chalk.cyan(
      `Attempting to bind to HOST environment variable: ${chalk.yellow(
        chalk.bold(process.env.HOST)
      )}`
    )
  );
  console.log(
    `If this was unintentional, check that you haven't mistakenly set it in your shell.`
  );
  console.log(
    `Learn more here: ${chalk.yellow('https://cra.link/advanced-config')}`
  );
  console.log();
}

// We require that you explicitly set browsers and do not fall back to
// browserslist defaults.
const { checkBrowsers } = require('react-dev-utils/browsersHelper');
checkBrowsers(paths.appPath, isInteractive)
  .then(() => {
    // We attempt to use the default port but if it is busy, we offer the user to
    // run on a different port. `choosePort()` Promise resolves to the next free port.
    return choosePort(HOST, DEFAULT_PORT);
  })
  .then(port => {
    if (port == null) {
      // We have not found a port.
      return;
    }

    const config = configFactory('development');
    const protocol = process.env.HTTPS === 'true' ? 'https' : 'http';
    const appName = require(paths.appPackageJson).name;

    const useTypeScript = fs.existsSync(paths.appTsConfig);
    const urls = prepareUrls(
      protocol,
      HOST,
      port,
      paths.publicUrlOrPath.slice(0, -1)
    );
    // Create a webpack compiler that is configured with custom messages.
    const compiler = createCompiler({
      appName,
      config,
      urls,
      useYarn,
      useTypeScript,
      webpack,
    });
    // Load proxy config
    const proxySetting = require(paths.appPackageJson).proxy;
    const proxyConfig = prepareProxy(
      proxySetting,
      paths.appPublic,
      paths.publicUrlOrPath
    );
    // Serve webpack assets generated by the compiler over a web server.
    const serverConfig = {
      ...createDevServerConfig(proxyConfig, urls.lanUrlForConfig),
      host: HOST,
      port,
    };
    const devServer = new WebpackDevServer(serverConfig, compiler);
    // Launch WebpackDevServer.
    devServer.startCallback(() => {
      if (isInteractive) {
        clearConsole();
      }

      if (env.raw.FAST_REFRESH && semver.lt(react.version, '16.10.0')) {
        console.log(
          chalk.yellow(
            `Fast Refresh requires React 16.10 or higher. You are using React ${react.version}.`
          )
        );
      }

      console.log(chalk.cyan('Starting the development server...\n'));
      openBrowser(urls.localUrlForBrowser);
    });

    ['SIGINT', 'SIGTERM'].forEach(function (sig) {
      process.on(sig, function () {
        devServer.close();
        process.exit();
      });
    });

    if (process.env.CI !== 'true') {
      // Gracefully exit when stdin ends
      process.stdin.on('end', function () {
        devServer.close();
        process.exit();
      });
    }
  })
  .catch(err => {
    if (err && err.message) {
      console.log(err.message);
    }
    process.exit(1);
  });
@ -1,52 +0,0 @@
'use strict';

// Do this as the first thing so that any code reading it knows the right env.
process.env.BABEL_ENV = 'test';
process.env.NODE_ENV = 'test';
process.env.PUBLIC_URL = '';

// Makes the script crash on unhandled rejections instead of silently
// ignoring them. In the future, promise rejections that are not handled will
// terminate the Node.js process with a non-zero exit code.
process.on('unhandledRejection', err => {
  throw err;
});

// Ensure environment variables are read.
require('../config/env');

const jest = require('jest');
const execSync = require('child_process').execSync;
let argv = process.argv.slice(2);

function isInGitRepository() {
  try {
    execSync('git rev-parse --is-inside-work-tree', { stdio: 'ignore' });
    return true;
  } catch (e) {
    return false;
  }
}

function isInMercurialRepository() {
  try {
    execSync('hg --cwd . root', { stdio: 'ignore' });
    return true;
  } catch (e) {
    return false;
  }
}

// Watch unless on CI or explicitly running all tests
if (
  !process.env.CI &&
  argv.indexOf('--watchAll') === -1 &&
  argv.indexOf('--watchAll=false') === -1
) {
  // https://github.com/facebook/create-react-app/issues/5210
  const hasSourceControl = isInGitRepository() || isInMercurialRepository();
  argv.push(hasSourceControl ? '--watch' : '--watchAll');
}

jest.run(argv);
@ -1,44 +0,0 @@
.canvas {
  width: 100vw;
  height: 100vh;
  display: block;
}

.root {
  overflow-y: hidden;
}

.centeredDiv {
  width: 100vw;
  display: flex;
  justify-content: center;
}

.projects-container {
  width: 100%;
  background-color: aliceblue;
  display: flex;
  flex-direction: column;
  justify-content: space-evenly;
  align-items: center;
  overflow-y: hidden;
}

.centeredContainer {
  display: flex;
  flex-direction: column;
  align-content: center;
  align-items: center;
}

label {
  background-color: indigo;
  color: white;
  padding: 0.5rem;
  font-family: sans-serif;
  border-radius: 0.3rem;
  cursor: pointer;
  margin-top: 1rem;
}

#file-chosen {
  margin-left: 0.3rem;
  font-family: sans-serif;
}
@ -1,10 +0,0 @@
// @ts-nocheck
import { ReactComponent as SolidSvg } from "./assets/solid.svg";
import { ReactComponent as PartSvg } from "./assets/part.svg";

export const svg = { SolidSvg, PartSvg };

export { svg as SVG };
@ -1,35 +0,0 @@
export enum HttpMethod {
  GET = 'GET',
  POST = 'POST'
}
export enum HttpRoute {
  insertionPath = '/assembly/preview/insertion_sequence/',
  assemblyPreviewPath = '/assembly/preview/subsequence/',
  projects = '/assembly/preview',
  createProject = '/assembly/create',
  ajaxMatrix = 'matrix.json'
}
export class HttpRepository {
  static server = 'http://localhost:3002'

  // JSON request: serializes `data` and sets the JSON content type.
  static async jsonRequest<T>(method: HttpMethod, url: string, data?: any): Promise<T> {
    const reqInit: RequestInit = {
      method: method,
      headers: { 'Content-Type': 'application/json' },
    }
    if (data !== undefined) {
      reqInit.body = JSON.stringify(data)
    }
    return (await fetch(this.server + url, reqInit)).json()
  }

  // Raw request: passes `data` through untouched (e.g. FormData uploads).
  static async request<T>(method: HttpMethod, url: string, data?: any): Promise<T> {
    const reqInit: RequestInit = {
      method: method,
    }
    if (data !== undefined) {
      reqInit.body = data
    }
    return (await fetch(this.server + url, reqInit)).json()
  }
}
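A short usage sketch for the `HttpRepository` class above; the import path and project id are illustrative, but the routes come straight from `HttpRoute` and mirror how the screens later in this diff call it:

// Usage sketch: fetch the project list, then one project's adjacency matrix.
// "example-project" is a made-up id for illustration.
import { HttpMethod, HttpRepository, HttpRoute } from "./http_repository";

async function demo(): Promise<void> {
  const projects = await HttpRepository.jsonRequest<Array<string>>(
    HttpMethod.GET,
    HttpRoute.projects
  );
  console.log("projects:", projects);

  const matrix = await HttpRepository.jsonRequest<unknown>(
    HttpMethod.GET,
    "/example-project/" + HttpRoute.ajaxMatrix
  );
  console.log("matrix:", matrix);
}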
@ -1 +0,0 @@
export {}
@ -1,89 +0,0 @@
import * as React from "react";
import { useEffect, useState } from "react";
import {
  HttpMethod,
  HttpRepository,
  HttpRoute,
} from "../../core/repository/http_repository";
import { Button } from "antd";
import { Typography } from "antd";
import { Card } from "antd";
import { createProjectRoute } from "../create_project/create_project";
import { useNavigate } from "react-router-dom";
import { pathAjaxTopologyScreen } from "../topology_ajax_preview/topology_ajax_preview";
import { pathStabilityScreen } from "../stability_preview/stability_preview";

const { Text, Link, Title } = Typography;
function LinkCreateProjectPage() {
  const navigate = useNavigate();

  return (
    <Link
      style={{ paddingLeft: "10px" }}
      onClick={() => {
        navigate(createProjectRoute);
      }}
    >
      <> add new project?</>
    </Link>
  );
}

export const ProjectsPath = "/";
export const ProjectScreen: React.FunctionComponent = () => {
  const [projects, setProjects] = useState<Array<String>>([]);
  const navigate = useNavigate();

  useEffect(() => {
    async function fetchData() {
      setProjects(
        await HttpRepository.jsonRequest<Array<String>>(
          HttpMethod.GET,
          HttpRoute.projects
        )
      );
    }
    fetchData();
  }, []);
  return (
    <>
      <div className="centeredDiv">
        <Title>Projects</Title>
      </div>
      <div>
        {projects.length === 0 ? (
          <div className="centeredDiv">
            <Text>No projects found</Text>
            <div>
              <LinkCreateProjectPage />
            </div>
          </div>
        ) : (
          <div></div>
        )}
      </div>
      <div className="projects-container">
        {projects.map((el) => {
          return (
            <Card style={{ width: 300 }} key={el as string}>
              <div>{el}</div>
              <Button
                onClick={() => {
                  navigate(pathAjaxTopologyScreen + el);
                }}
              >
                Preview topology ajax computed
              </Button>
              <Button
                onClick={() => {
                  navigate(pathStabilityScreen + el);
                }}
              >
                Preview stability computed
              </Button>
              <Button> Preview insert Path </Button>
              <Button>Preview assembly logical </Button>
            </Card>
          );
        })}
        <div> {projects.length === 0 ? <></> : <LinkCreateProjectPage />} </div>
      </div>
    </>
  );
};
@ -1,198 +0,0 @@
import * as React from "react";
import {
  DirectionalLight,
  Object3D,
  PerspectiveCamera,
  Scene,
  WebGLRenderer,
  AmbientLight,
  Vector3,
  Group,
  Quaternion,
} from "three";
import { OrbitControls } from "three/examples/jsm/controls/OrbitControls";
import { OBJLoader } from "three/examples/jsm/loaders/OBJLoader";
import CSS from "csstype";
import {
  HttpMethod,
  HttpRepository,
  HttpRoute,
} from "../../core/repository/http_repository";
import { useParams } from "react-router-dom";

const canvasStyle: CSS.Properties = {
  backgroundColor: "rgb(151 41 41 / 85%)",
};

export const AssemblyPreviewInsertVectorPath = "/insertion_vector/";

export interface AssemblyPreviewInsertionPathModel {
  offset: number;
  count: number;
  parent: string;
  child: string;
  insertions: Insertions;
}

export interface Insertions {
  time: number;
  insertion_path: InsertionPath[];
  status: string;
}

export interface InsertionPath {
  // NB: "quadrelion" mirrors the backend JSON key (a quaternion: x, y, z, w).
  quadrelion: number[];
  xyz: number[];
  euler: number[];
}

export function AssemblyPreviewInsertVector() {
  const container = new Object3D();
  const canvasRef = React.useRef<HTMLCanvasElement>(null);
  const scene = new Scene();
  const camera = new PerspectiveCamera(
    80,
    window.innerWidth / window.innerHeight,
    0.1,
    1000
  );
  let renderId = 1;
  let assemblyCounter: undefined | Number = undefined;
  let params = useParams().id;

  React.useEffect(() => {
    const renderer = new WebGLRenderer({
      canvas: canvasRef.current as HTMLCanvasElement,
      antialias: true,
      alpha: true,
    });

    camera.position.set(2, 1, 2);

    const directionalLight = new DirectionalLight(0xffffff, 0.2);
    directionalLight.castShadow = true;
    directionalLight.position.set(-1, 2, 4);
    scene.add(directionalLight);

    const ambientLight = new AmbientLight(0xffffff, 0.7);
    scene.add(ambientLight);
    container.position.set(0, 0, 0);

    renderer.setSize(window.innerWidth, window.innerHeight);

    const onResize = () => {
      camera.aspect = window.innerWidth / window.innerHeight;
      camera.updateProjectionMatrix();
      renderer.setSize(window.innerWidth, window.innerHeight);
    };

    window.addEventListener("resize", onResize, false);
    new OrbitControls(camera, renderer.domElement);

    renderer.setAnimationLoop(() => {
      renderer.render(scene, camera);
    });

    renderObject(1, params!);
  });

  async function renderObject(renderId: Number, projectId: String) {
    const assemblyResponse =
      await HttpRepository.jsonRequest<AssemblyPreviewInsertionPathModel>(
        HttpMethod.GET,
        `${HttpRoute.insertionPath}${projectId}?count=${renderId}`
      );
    // Remember how many insertion sequences exist so click() can wrap around.
    assemblyCounter = assemblyResponse.count;
    const objectControl = (
      await loadObject([assemblyResponse.child, assemblyResponse.parent])
    )[1];

    function assemblyAnimate(objectId: Number, coords: InsertionPath) {
      const object = scene.getObjectById(objectId as number);
      const r = 1;
      object?.position.set(coords.xyz[0] * r, coords.xyz[1] * r, coords.xyz[2] * r);
      object?.setRotationFromQuaternion(
        new Quaternion(
          coords.quadrelion[0],
          coords.quadrelion[1],
          coords.quadrelion[2],
          coords.quadrelion[3]
        )
      );
    }
    function timer(ms: number) {
      return new Promise((res) => setTimeout(res, ms));
    }
    async function load(id: Number, len: number) {
      for (let i = 0; i < len; i++) {
        assemblyAnimate(objectControl, assemblyResponse.insertions.insertion_path[i]);
        await timer(3);
      }
    }

    // Play the recorded path in reverse: from the free pose into the fitted pose.
    assemblyResponse.insertions.insertion_path =
      assemblyResponse.insertions.insertion_path.reverse();
    load(objectControl, assemblyResponse.insertions.insertion_path.length);
  }

  async function click() {
    renderId = renderId + 1;

    if (assemblyCounter === renderId) {
      renderId = 1;
    }
    scene.clear();
    renderObject(renderId, params!);
  }

  async function loadObject(objectList: string[]): Promise<Number[]> {
    const promises: Array<Promise<Group>> = [];
    objectList.forEach((e) => {
      const objLoader = new OBJLoader();
      promises.push(objLoader.loadAsync(e));
    });

    const objects = await Promise.all(promises);
    const result: Array<Number> = [];
    for (let i = 0; objects.length > i; i++) {
      const el = objects[i];
      container.add(el);
      scene.add(container);
      result.push(el.id);
      const directionalLight = new DirectionalLight(0xffffff, 0.2);
      directionalLight.castShadow = true;
      directionalLight.position.set(
        container.position.x - 10,
        container.position.y - 10,
        container.position.z - 10
      );
      scene.add(directionalLight);
      container.position.set(0, 0, 0);
      fitCameraToCenteredObject(camera, container);
    }
    return result;
  }
  function fitCameraToCenteredObject(
    camera: PerspectiveCamera,
    object: Object3D
  ) {
    const dist = 20;
    const vector = new Vector3();

    camera.getWorldDirection(vector);

    vector.multiplyScalar(dist);
    vector.add(camera.position);

    object.position.set(vector.x, vector.y, vector.z);
    object.setRotationFromQuaternion(camera.quaternion);
  }
  return (
    <>
      <div className="loader">
        <div onClick={() => click()}>next</div>
        <canvas style={canvasStyle} ref={canvasRef} />
      </div>
    </>
  );
}
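The playback loop above snaps the part to each recorded frame every 3 ms. If smoother motion were wanted, consecutive frames could be blended with three.js interpolation helpers; a hedged sketch (the frame shape is taken from `InsertionPath`, but the blending step itself is an assumption, not part of the original screen):

// Sketch: blend between two recorded insertion frames with three.js helpers.
// `t` in [0, 1] is the interpolation parameter; 0 = frame a, 1 = frame b.
import { Object3D, Quaternion, Vector3 } from "three";

function blendFrames(
  object: Object3D,
  a: { xyz: number[]; quadrelion: number[] },
  b: { xyz: number[]; quadrelion: number[] },
  t: number
): void {
  const posA = new Vector3().fromArray(a.xyz);
  const posB = new Vector3().fromArray(b.xyz);
  object.position.copy(posA.lerp(posB, t)); // linear position blend

  const quatA = new Quaternion().fromArray(a.quadrelion);
  const quatB = new Quaternion().fromArray(b.quadrelion);
  object.quaternion.copy(quatA.slerp(quatB, t)); // spherical rotation blend
}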
@ -1,141 +0,0 @@
import React, { useEffect } from "react";
import {
  DirectionalLight,
  Object3D,
  PerspectiveCamera,
  Scene,
  WebGLRenderer,
  AmbientLight,
  Vector3,
} from "three";
import { OrbitControls } from "three/examples/jsm/controls/OrbitControls";
import { OBJLoader } from "three/examples/jsm/loaders/OBJLoader";
import CSS from "csstype";

import { useParams } from "react-router-dom";
import { HttpMethod, HttpRepository, HttpRoute } from "../../core/repository/http_repository";

const canvasStyle: CSS.Properties = {
  backgroundColor: "rgb(151 41 41 / 85%)",
};

export interface AssemblyPreviewStructure {
  assembly: string[];
  offset: number;
  count: number;
}

export const AssemblyPreviewSubsequencePath = "/123/";

export const AssemblyPreviewSubsequence = () => {
  const container = new Object3D();
  const canvasRef = React.useRef<HTMLCanvasElement>(null);
  const scene = new Scene();
  const camera = new PerspectiveCamera(
    80,
    window.innerWidth / window.innerHeight,
    0.1,
    1000
  );
  let renderId = 1;
  let assemblyCounter: undefined | Number = undefined;
  let params = useParams().id;

  useEffect(() => {
    const renderer = new WebGLRenderer({
      canvas: canvasRef.current as HTMLCanvasElement,
      antialias: true,
      alpha: true,
    });

    camera.position.set(2, 1, 2);

    const directionalLight = new DirectionalLight(0xffffff, 0.2);
    directionalLight.castShadow = true;
    directionalLight.position.set(-1, 2, 4);
    scene.add(directionalLight);

    const ambientLight = new AmbientLight(0xffffff, 0.7);
    scene.add(ambientLight);
    container.position.set(0, 0, 0);

    renderer.setSize(window.innerWidth, window.innerHeight);

    const onResize = () => {
      camera.aspect = window.innerWidth / window.innerHeight;
      camera.updateProjectionMatrix();
      renderer.setSize(window.innerWidth, window.innerHeight);
    };

    window.addEventListener("resize", onResize, false);
    new OrbitControls(camera, renderer.domElement);

    renderer.setAnimationLoop(() => {
      renderer.render(scene, camera);
    });
    renderObject(1, params!);
  });

  async function renderObject(renderId: Number, projectId: string) {
    const assemblyResponse =
      await HttpRepository.jsonRequest<AssemblyPreviewStructure>(
        HttpMethod.GET,
        `${HttpRoute.assemblyPreviewPath}${projectId}?count=${renderId}`
      );
    assemblyCounter = assemblyResponse.count;

    loadObject(assemblyResponse.assembly);
  }

  async function click() {
    renderId = renderId + 1;
    if (assemblyCounter === renderId) {
      renderId = 1;
    }
    renderObject(renderId, params!);
  }

  function loadObject(objectList: string[]) {
    objectList.forEach((el) => {
      const objLoader = new OBJLoader();
      objLoader.load(
        el,
        (object) => {
          object.scale.x = 0.3;
          object.scale.y = 0.3;
          object.scale.z = 0.3;
          object.rotation.x = -Math.PI / 2;
          object.position.y = -30;
          container.add(object);
          scene.add(container);

          fitCameraToCenteredObject(camera, container);
        },
        (xhr) => {
          console.log((xhr.loaded / xhr.total) * 100 + "% loaded");
        },
        (error) => {
          console.log(error);
        }
      );
    });
  }
  function fitCameraToCenteredObject(
    camera: PerspectiveCamera,
    object: Object3D
  ) {
    const dist = 50;
    const vector = new Vector3();

    camera.getWorldDirection(vector);

    vector.multiplyScalar(dist);
    vector.add(camera.position);

    object.position.set(vector.x, vector.y, vector.z);
    object.setRotationFromQuaternion(camera.quaternion);
  }
  return <canvas onClick={() => click()} style={canvasStyle} ref={canvasRef} />;
};
@ -1,72 +0,0 @@
import { Spin, Typography } from "antd";
import * as React from "react";
import { useNavigate } from "react-router-dom";
import {
  HttpMethod,
  HttpRepository,
  HttpRoute,
} from "../../core/repository/http_repository";
import { pathStabilityScreen } from "../stability_preview/stability_preview";

const { Title } = Typography;

export const createProjectRoute = "/new_project";

const UploadButton = () => {
  const navigate = useNavigate();
  const [isLoading, setLoading] = React.useState<boolean>(false);

  const handleFileChange = function (e: React.ChangeEvent<HTMLInputElement>) {
    const fileList = e.target.files;

    if (!fileList) return;

    let file = fileList[0] as File;
    uploadFile(file);
  };

  const uploadFile = async (file: File) => {
    if (file) {
      const formData = new FormData();
      formData.append("freecad", file, file.name);
      setLoading(true);
      await HttpRepository.request(
        HttpMethod.POST,
        HttpRoute.createProject,
        formData
      );
      setLoading(false);
      navigate(pathStabilityScreen);
    }
  };
  return isLoading ? (
    <>
      <Spin />
    </>
  ) : (
    <label htmlFor="cadFile">
      <input
        accept=".FCStd"
        style={{ display: "none" }}
        id="cadFile"
        name="cadFile"
        type="file"
        multiple={false}
        onChange={handleFileChange}
      />
      Choose CAD file
    </label>
  );
};
export default function CreateProject() {
  return (
    <div className="centeredContainer">
      <div className="centeredDiv">
        <Title>Create new project</Title>
      </div>
      <div style={{ paddingTop: "10px" }}>
        <UploadButton />
      </div>
    </div>
  );
}
@ -1,57 +0,0 @@
import { Button } from 'antd';
import * as React from 'react';
import { useParams } from 'react-router-dom';
import { HttpRepository, HttpMethod, HttpRoute } from '../../core/repository/http_repository';

export const pathStabilityScreen = '/stability/preview/usecase/'

interface IStabilityCheckResponse {
  status: "rejected" | "fulfilled";
  value: undefined | string;
  index: number;
}
interface IStability {
  status: boolean;
  detail: string;
}

export const StabilityPreviewScreen: React.FunctionComponent = () => {
  const id = useParams().id
  const [stabilityResult, setStability] = React.useState<IStability[] | null>(null);
  React.useEffect(() => {
    const stabilityCheck = async () => {
      const result = await HttpRepository.jsonRequest<Array<string>>(HttpMethod.GET, '/' + id + '/generation/step-structure.json')
      const promises = []
      for (let i = 0; i !== result.length; i++) {
        const stabilitySubId = i + 1
        promises.push(HttpRepository.jsonRequest<Array<string>>(HttpMethod.GET, '/' + id + '/generation/stability/' + stabilitySubId + '/geometry.json'))
      }
      // A rejected fetch means that step has no computed stability geometry yet.
      const checks = (await Promise.allSettled(promises)).map<IStability>((element, index) => {
        return {
          status: element.status === 'fulfilled',
          detail: result[index],
        }
      })
      setStability(checks)
    };
    stabilityCheck()
  }, []);
  return (<div>
    {stabilityResult != null ? (<>
      {stabilityResult.map((el, index) => {
        return (<div key={index}><div>{el.detail}</div> <div>{el.status ? (<>Success</>) : (<><Button onClick={async () => {
          await HttpRepository.jsonRequest(HttpMethod.POST, '/assembly/stability/write/computed', {
            "id": id,
            "buildNumber": (index + 1).toString()
          })
        }}>needs input</Button></>)}</div> </div>)
      })}
    </>) : (<div>loading</div>)}
  </div>);
};
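For reference, a minimal sketch of the `Promise.allSettled` pattern the screen above relies on: each entry settles to `{status: 'fulfilled', value}` or `{status: 'rejected', reason}`, so one missing `geometry.json` shows up as a rejected slot instead of aborting the whole check (URLs here are hypothetical):

// Sketch: map settled fetches onto per-step pass/fail flags.
async function settleExample(urls: string[]): Promise<boolean[]> {
  const settled = await Promise.allSettled(
    urls.map(url => fetch(url).then(r => r.json()))
  );
  // 'fulfilled' means that step's geometry.json existed and parsed.
  return settled.map(entry => entry.status === "fulfilled");
}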
@ -1,48 +0,0 @@
import * as React from 'react';
import { useParams } from 'react-router-dom';
import { HttpRepository, HttpMethod, HttpRoute } from '../../core/repository/http_repository';

export const pathAjaxTopologyScreen = '/topology/adjax/usecase/'
export interface IAdjaxMatrix {
  allParts: string[];
  firstDetail: string;
  matrix: StringMap;
  matrixError: StringMap | null;
}
interface StringMap { [key: string]: string; }

export const MatrixTopologyAdjaxScreen: React.FunctionComponent = () => {
  const [matrix, setMatrix] = React.useState<IAdjaxMatrix | null>(null);
  const param = useParams().id
  React.useEffect(() => {
    async function fetchData() {
      setMatrix(
        await HttpRepository.jsonRequest<IAdjaxMatrix>(
          HttpMethod.GET,
          '/' + param + '/' + HttpRoute.ajaxMatrix
        )
      );
    }
    fetchData();
  }, []);
  return (<div>
    {matrix === null ? (<>loading</>) : (<>
      {matrix.matrixError != null ? (<>
        {Object.keys(matrix.matrixError).map((keyName, i) => {
          const m = matrix.matrixError as StringMap;
          return (
            <div key={i}>
              <div>{m[keyName]}</div>
            </div>
          )
        })}
      </>) : (<>Success</>)}
    </>)}
  </div>);
};
asp-review-app/ui/src/global.d.ts (vendored, 14 lines)

@ -1,14 +0,0 @@
/// <reference types="react-scripts" />
import { resources, defaultNS } from './i18n';

declare module 'i18next' {
  interface CustomTypeOptions {
    defaultNS: typeof defaultNS;
    resources: typeof resources['en'];
  }
}
declare module "*.svg" {
  import { ReactElement, SVGProps } from "react";
  const content: (props: SVGProps<SVGElement>) => ReactElement;
  export default content;
}
@ -1,13 +0,0 @@
body {
  margin: 0;
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
    'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
    sans-serif;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
}

code {
  font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
    monospace;
}
@ -1,50 +0,0 @@
import { render } from "react-dom";
import "./App.css";
import "./index.css";
import { createBrowserRouter, RouterProvider } from "react-router-dom";
import {
  AssemblyPreviewInsertVector,
  AssemblyPreviewInsertVectorPath,
} from "./features/assembly_preview_insert_vector/Assembly_preview_insert_vector_screen";
import {
  ProjectScreen,
  ProjectsPath,
} from "./features/all_project/all_project_screen";
import {
  AssemblyPreviewSubsequence,
  AssemblyPreviewSubsequencePath,
} from "./features/assembly_preview_subsequence/assembly_preview_subsequence_screen";
import CreateProject, { createProjectRoute } from "./features/create_project/create_project";
import { pathAjaxTopologyScreen, MatrixTopologyAdjaxScreen } from "./features/topology_ajax_preview/topology_ajax_preview";
import { pathStabilityScreen, StabilityPreviewScreen } from "./features/stability_preview/stability_preview";

const rootElement = document.getElementById("root");

const router = createBrowserRouter([
  {
    path: ProjectsPath,
    element: <ProjectScreen />,
  },
  {
    path: createProjectRoute,
    element: <CreateProject />,
  },
  {
    path: AssemblyPreviewSubsequencePath + ":id",
    element: <AssemblyPreviewSubsequence />,
  },
  {
    path: AssemblyPreviewInsertVectorPath + ":id",
    element: <AssemblyPreviewInsertVector />,
  },
  {
    path: pathAjaxTopologyScreen + ":id",
    element: <MatrixTopologyAdjaxScreen />,
  },
  {
    path: pathStabilityScreen + ":id",
    element: <StabilityPreviewScreen />,
  },
]);

render(<RouterProvider router={router} />, rootElement);
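One note on the entry point above: `render` from `react-dom` still works under React 18 (the version pinned in package.json) but logs a deprecation warning and keeps the app in React 17 compatibility mode. A sketch of the React 18 `createRoot` equivalent, reusing the `router` defined above:

// React 18 entry-point sketch using the createRoot API.
import { createRoot } from "react-dom/client";
import { RouterProvider } from "react-router-dom";

const rootElement = document.getElementById("root");
if (rootElement) {
  createRoot(rootElement).render(<RouterProvider router={router} />);
}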
@ -1,15 +0,0 @@
import { ReportHandler } from 'web-vitals';

const reportWebVitals = (onPerfEntry?: ReportHandler) => {
  if (onPerfEntry && onPerfEntry instanceof Function) {
    import('web-vitals').then(({ getCLS, getFID, getFCP, getLCP, getTTFB }) => {
      getCLS(onPerfEntry);
      getFID(onPerfEntry);
      getFCP(onPerfEntry);
      getLCP(onPerfEntry);
      getTTFB(onPerfEntry);
    });
  }
};

export default reportWebVitals;
@ -1,27 +0,0 @@
{
  "compilerOptions": {
    "target": "ES6",
    "lib": [
      "dom",
      "dom.iterable",
      "esnext"
    ],
    "allowJs": true,
    "skipLibCheck": true,
    "esModuleInterop": true,
    "experimentalDecorators": true,
    "allowSyntheticDefaultImports": true,
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "noFallthroughCasesInSwitch": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx"
  },
  "include": [
    "src"
  ]
}
File diff suppressed because it is too large

45 asp/main.py

@ -1,45 +0,0 @@
import argparse
import shutil
import os

from helper.fs import FS
from src.usecases.urdf_sub_assembly_usecase import UrdfSubAssemblyUseCase
from src.model.sdf_geometry import GeometryModel
from src.usecases.sdf_sub_assembly_usecase import SdfSubAssemblyUseCase


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument('--generationFolder', help='FreeCad generation folder')
    parser.add_argument('--outPath', help='save SDF path')
    parser.add_argument('--world', help='adding sdf world')
    parser.add_argument('--format', help='urdf,sdf,mujoco')
    args = parser.parse_args()

    if args.generationFolder is None or args.outPath is None:
        parser.print_help()
        exit(1)  # both paths are required; bail out instead of crashing below
    outPath = args.outPath
    geometryFiles = FS.readFilesTypeFolder(args.generationFolder + '/assets/')
    assemblyStructure = FS.readJSON(
        args.generationFolder + '/step-structure.json')

    geometryModels: list[GeometryModel] = []
    for el in geometryFiles:
        geometryModels.append(GeometryModel.from_dict(
            FS.readJSON(args.generationFolder + '/assets/' + el)))
    # if os.path.exists(outPath + 'sdf-generation/'):
    #     shutil.rmtree(path=outPath + 'sdf-generation/')

    if args.format == 'sdf':
        SdfSubAssemblyUseCase().call(
            geometryModels=geometryModels, assembly=assemblyStructure,
            world=args.world,
            generationFolder=args.generationFolder,
            outPath=args.outPath
        )
    if args.format == 'urdf':
        UrdfSubAssemblyUseCase().call(
            geometryModels=geometryModels, assembly=assemblyStructure,
            world=args.world,
            generationFolder=args.generationFolder,
            outPath=args.outPath
        )
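For reference, a typical invocation of this entry point (the flag names come straight from the parser above; the concrete paths are placeholders):

```
python main.py --generationFolder /path/to/generation --outPath /path/to/out --format sdf
```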
@ -1,2 +0,0 @@
class Enum:
    folderPath = 'sdf-generation/'
@ -1,181 +0,0 @@
import os
import typing
import uuid

from helper.fs import FS
from src.model.sdf_join import SdfJoin


def from_str(x):
    assert isinstance(x, str)
    return x


def from_none(x):
    assert x is None
    return x


def from_union(fs, x):
    for f in fs:
        try:
            return f(x)
        except:
            pass
    assert False


def to_class(c, x):
    assert isinstance(x, c)
    return x.to_dict()


DELIMITER_SCALE = 10000


class GeometryModel:
    def __init__(self, name, ixx, ixy, ixz, iyy, izz, massSDF, posX, posY, posZ,
                 eulerX, eulerY, eulerZ, iyz, stl, link, friction,
                 centerMassX, centerMassY, centerMassZ):
        self.name = name
        self.ixx = ixx
        self.ixy = ixy
        self.ixz = ixz
        self.iyy = iyy
        self.izz = izz
        self.massSDF = massSDF
        self.posX = posX
        self.posY = posY
        self.posZ = posZ
        self.eulerX = eulerX
        self.eulerY = eulerY
        self.eulerZ = eulerZ
        self.iyz = iyz
        self.stl = stl
        self.link = link
        self.friction = friction
        self.centerMassX = centerMassX
        self.centerMassY = centerMassY
        self.centerMassZ = centerMassZ

    @staticmethod
    def from_dict(obj):
        assert isinstance(obj, dict)
        name = from_union([from_str, from_none], obj.get("name"))
        ixx = from_union([from_str, from_none], obj.get("ixx"))
        ixy = from_union([from_str, from_none], obj.get("ixy"))
        ixz = from_union([from_str, from_none], obj.get("ixz"))
        iyy = from_union([from_str, from_none], obj.get("iyy"))
        izz = from_union([from_str, from_none], obj.get("izz"))
        massSDF = from_union([from_str, from_none], obj.get("massSDF"))
        posX = from_union([from_str, from_none], obj.get("posX"))
        posY = from_union([from_str, from_none], obj.get("posY"))
        posZ = from_union([from_str, from_none], obj.get("posZ"))
        eulerX = from_union([from_str, from_none], obj.get("eulerX"))
        eulerY = from_union([from_str, from_none], obj.get("eulerY"))
        eulerZ = from_union([from_str, from_none], obj.get("eulerZ"))
        iyz = from_union([from_str, from_none], obj.get("iyz"))
        stl = from_union([from_str, from_none], obj.get("stl"))
        link = from_union([from_str, from_none], obj.get('link'))
        friction = from_union([from_str, from_none], obj.get("friction"))
        centerMassX = from_union([from_str, from_none], obj.get("centerMassX"))
        centerMassY = from_union([from_str, from_none], obj.get("centerMassY"))
        centerMassZ = from_union([from_str, from_none], obj.get("centerMassZ"))
        return GeometryModel(name, ixx, ixy, ixz, iyy, izz, massSDF, posX, posY,
                             posZ, eulerX, eulerY, eulerZ, iyz, stl, link,
                             friction, centerMassX, centerMassY, centerMassZ)

    def to_dict(self):
        result = {}
        if self.name is not None:
            result["name"] = from_union([from_str, from_none], self.name)
        if self.ixx is not None:
            result["ixx"] = from_union([from_str, from_none], self.ixx)
        if self.ixy is not None:
            result["ixy"] = from_union([from_str, from_none], self.ixy)
        if self.ixz is not None:
            result["ixz"] = from_union([from_str, from_none], self.ixz)
        if self.iyy is not None:
            result["iyy"] = from_union([from_str, from_none], self.iyy)
        if self.izz is not None:
            result["izz"] = from_union([from_str, from_none], self.izz)
        if self.massSDF is not None:
            result["massSDF"] = from_union([from_str, from_none], self.massSDF)
        if self.posX is not None:
            result["posX"] = from_union([from_str, from_none], self.posX)
        if self.posY is not None:
            result["posY"] = from_union([from_str, from_none], self.posY)
        if self.posZ is not None:
            result["posZ"] = from_union([from_str, from_none], self.posZ)
        if self.eulerX is not None:
            result["eulerX"] = from_union([from_str, from_none], self.eulerX)
        if self.eulerY is not None:
            result["eulerY"] = from_union([from_str, from_none], self.eulerY)
        if self.eulerZ is not None:
            result["eulerZ"] = from_union([from_str, from_none], self.eulerZ)
        if self.iyz is not None:
            result["iyz"] = from_union([from_str, from_none], self.iyz)
        if self.stl is not None:
            result["stl"] = from_union([from_str, from_none], self.stl)
        if self.link is not None:
            result['link'] = from_union([from_str, from_none], self.link)
        if self.friction is not None:
            # serialize the friction value itself (the original mistakenly wrote eulerZ here)
            result["friction"] = from_union([from_str, from_none], self.friction)
        if self.centerMassX is not None:
            result['centerMassX'] = from_union(
                [from_str, from_none], self.centerMassX)
        if self.centerMassY is not None:
            result['centerMassY'] = from_union(
                [from_str, from_none], self.centerMassY)
        if self.centerMassZ is not None:
            result['centerMassZ'] = from_union(
                [from_str, from_none], self.centerMassZ)
        return result

    def toJSON(self) -> str:
        return str(self.to_dict()).replace('\'', '"')

    def toSDF(self):
        return (FS.readFile(os.path.dirname(os.path.realpath(__file__))
                            + '/../../mocks/sdf/model.sdf')
                .replace('{name}', self.name)
                .replace('{posX}', self.posX)
                .replace('{posY}', self.posY)
                .replace('{posZ}', self.posZ)
                .replace('{eulerX}', self.eulerX)
                .replace('{eulerY}', self.eulerY)
                .replace('{eulerZ}', self.eulerZ)
                .replace('{ixx}', self.ixx)
                .replace('{ixy}', self.ixy)
                .replace('{ixz}', self.ixz)
                .replace('{iyy}', self.iyy)
                .replace('{iyz}', self.iyz)
                .replace('{izz}', self.izz)
                .replace('{massSDF}', self.massSDF)
                .replace('{stl}', self.stl)
                .replace('{friction}', self.friction))

    def toSdfLink(self):
        return (FS.readFile(os.path.dirname(os.path.realpath(__file__))
                            + '/../../mocks/sdf/link.sdf')
                .replace('{name}', self.name)
                .replace('{posX}', self.posX)
                .replace('{posY}', self.posY)
                .replace('{posZ}', self.posZ)
                .replace('{eulerX}', self.eulerX)
                .replace('{eulerY}', self.eulerY)
                .replace('{eulerZ}', self.eulerZ)
                .replace('{ixx}', self.ixx)
                .replace('{ixy}', self.ixy)
                .replace('{ixz}', self.ixz)
                .replace('{iyy}', self.iyy)
                .replace('{iyz}', self.iyz)
                .replace('{izz}', self.izz)
                .replace('{massSDF}', self.massSDF)
                .replace('{stl}', self.stl)
                .replace('{friction}', self.friction))

    def includeLink(self, pose=False):
        if not pose:
            return (FS.readFile(os.path.dirname(os.path.realpath(__file__))
                                + '/../../mocks/sdf/include.sdf')
                    .replace('{name}', self.name)
                    .replace('{uri}', '/' + self.name))
        return (FS.readFile(os.path.dirname(os.path.realpath(__file__))
                            + '/../../mocks/sdf/include_pose.sdf')
                .replace('{name}', self.name)
                .replace('{uri}', '/' + self.name)
                .replace('{posX}', self.posX)
                .replace('{posY}', self.posY)
                .replace('{posZ}', self.posZ)
                .replace('{eulerX}', self.eulerX)
                .replace('{eulerY}', self.eulerY)
                .replace('{eulerZ}', self.eulerZ)
                .replace('{ixx}', self.ixx)
                .replace('{ixy}', self.ixy)
                .replace('{ixz}', self.ixz)
                .replace('{iyy}', self.iyy)
                .replace('{iyz}', self.iyz)
                .replace('{izz}', self.izz))

    def generateSDFatJoinFixed(self, sdfModels: list['GeometryModel']):
        sdf = '\n<model name="assembly">\n'
        sdf += '  <link name="base_link">\n'
        sdf += "    <pose>0 0 0 0 0 0</pose>\n"
        sdf += "  </link>\n"

        link = sdf + self.includeLink(pose=True)
        if len(sdfModels) == 0:
            return link
        endTagLinkInc = len(link)
        beginSDF = link[0:endTagLinkInc]

        sdfJoin = beginSDF + '\n'

        for el in sdfModels:
            if el.name != self.name:
                sdfJoin += el.includeLink(pose=True) + '\n'

        endSDF = link[endTagLinkInc:len(link)]

        for el in sdfModels:
            if el.name != self.name:
                sdfJoin += SdfJoin(name=str(uuid.uuid4()),
                                   parent=self.name, child=el.name, modelAt=el).toSDF() + '\n'

        sdfJoin += endSDF
        sdfJoin += '</model>'
        return sdfJoin

    def toUrdf(self):
        return (FS.readFile(os.path.dirname(os.path.realpath(__file__))
                            + '/../../mocks/urdf/model.urdf')
                .replace('{name}', self.name)
                .replace('{uri}', '/' + self.name)
                .replace('{posX}', self.posX)
                .replace('{posY}', self.posY)
                .replace('{posZ}', self.posZ)
                .replace('{eulerX}', self.eulerX)
                .replace('{eulerY}', self.eulerY)
                .replace('{eulerZ}', self.eulerZ)
                .replace('{ixx}', self.ixx)
                .replace('{ixy}', self.ixy)
                .replace('{ixz}', self.ixz)
                .replace('{iyy}', self.iyy)
                .replace('{iyz}', self.iyz)
                .replace('{izz}', self.izz)
                .replace('{stl}', '/' + self.stl)
                .replace('{massSDF}', self.massSDF)
                .replace('{centerMassX}', self.centerMassX)
                .replace('{centerMassY}', self.centerMassY)
                .replace('{centerMassZ}', self.centerMassZ))
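A minimal usage sketch for `GeometryModel`, assuming an asset JSON produced by the FreeCAD export scenario later in this diff (`assets/disk.json` is a hypothetical file name; note that every numeric field is serialized as a string, which is why `from_str` is the accepted type):

```python
# Sketch only: 'assets/disk.json' is a placeholder asset path.
from helper.fs import FS
from src.model.sdf_geometry import GeometryModel

model = GeometryModel.from_dict(FS.readJSON('assets/disk.json'))
sdf_text = model.toSDF()  # fills the mocks/sdf/model.sdf template
FS.writeFile(data=sdf_text, filePath='out/', fileName='model.sdf')
```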
@ -1,12 +0,0 @@
import os
from helper.fs import FS


class SdfGenerateWorldUseCase:
    @staticmethod  # takes no instance state and is invoked on the class
    def call(assembly: str) -> str:
        world = FS.readFile(os.path.dirname(os.path.realpath(__file__))
                            + '/../../mocks/sdf/world.sdf')
        beginWorld = world[0:world.find('</world') - 1]
        endWorld = world[world.find('</world') - 1:len(world)]

        return beginWorld + assembly + endWorld
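The use case above is pure string surgery: the assembly markup is spliced in just before the closing `</world>` tag. A self-contained illustration of the same cut, with an inline stand-in for `mocks/sdf/world.sdf`:

```python
# Standalone sketch of the splice; no FreeCAD or file I/O required.
world = "<sdf><world name='w'>\n</world></sdf>"   # stand-in template
assembly = "<include><uri>/part_a</uri></include>"

cut = world.find('</world') - 1          # one character before the tag
print(world[:cut] + assembly + world[cut:])
```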
@ -1,57 +0,0 @@
import os
from typing import Optional
from distutils.dir_util import copy_tree

from helper.fs import FS
from helper.fs import filterModels, listGetFirstValue
from src.model.asm import Assembly
from src.model.enum import Enum
from src.usecases.formatter_usecase import FormatterUseCase
from src.usecases.sdf_generate_world_usecase import SdfGenerateWorldUseCase
from src.model.sdf_geometry import GeometryModel

SDF_FILE_FORMAT = '.sdf'
CONFIG_PATH = os.path.dirname(os.path.realpath(
    __file__)) + '/../../mocks/sdf/model.config'


class SdfSubAssemblyUseCase(Assembly):

    def call(self, geometryModels: list[GeometryModel], assembly: list[str], outPath: str, generationFolder: str, world: bool):
        asm = {}
        generateSubAssemblyModels = self.generateSubAssembly(assembly)
        inc = 0
        for key, value in generateSubAssemblyModels.items():
            inc += 1
            if len(value['assembly']) != 0:
                model: Optional[GeometryModel] = listGetFirstValue(
                    geometryModels, None, lambda x: x.name == value['assembly'][0])

                if model is not None:
                    asm[key] = {"assembly": model.generateSDFatJoinFixed(filterModels(geometryModels, value['assembly'])), "part": (
                        listGetFirstValue(geometryModels, None, lambda x: x.name == value['part'])).includeLink()}

        self.copy(generationFolder=generationFolder, format='/sdf', outPath=outPath)
        dirPath = outPath + Enum.folderPath
        for el in geometryModels:
            path = dirPath + el.name + '/'
            os.makedirs(path)
            FS.writeFile(data=el.toSDF(), filePath=path,
                         fileName='/model' + SDF_FILE_FORMAT)
            FS.writeFile(data=FS.readFile(CONFIG_PATH),
                         filePath=path, fileName='/model' + '.config')

        # NOTE: the guard around the two branches below was lost in the
        # extracted diff (only a dangling `else:` survived); judging by the
        # `world` parameter, it plausibly selected between plain sub-assembly
        # files and world-wrapped ones. Reconstruction, not original code:
        if not world:
            for key, v in asm.items():
                FS.writeFile(data=v['assembly'], filePath=dirPath,
                             fileName='/' + key + SDF_FILE_FORMAT)
        else:
            for key, v in asm.items():
                FS.writeFile(data=SdfGenerateWorldUseCase.call(v['assembly']), filePath=dirPath,
                             fileName='/' + key + SDF_FILE_FORMAT)

        FormatterUseCase.call(outPath=outPath, format=SDF_FILE_FORMAT)
@ -1,19 +0,0 @@
import json
import re

from helper.fs import FS
from src.model.enum import Enum
from src.model.asm import Assembly
from src.model.sdf_geometry import GeometryModel

URDF_FILE_FORMAT = '.urdf'
URDF_GENERATOR_FILE = 'urdf-generation' + '.json'


class UrdfSubAssemblyUseCase(Assembly):
    def call(self, geometryModels: list[GeometryModel], assembly: list[str], outPath: str, generationFolder: str, world: bool):
        dirPath = generationFolder + Enum.folderPath
        asm = {}
        for el in geometryModels:
            asm[el.name] = el.toUrdf()
        FS.writeFile(data=json.dumps(asm, indent=4),
                     fileName=URDF_GENERATOR_FILE, filePath=dirPath)
@ -1,6 +0,0 @@
{
    "doc": "/home/idontsudo/framework/asp/out/disk_and_axis_n.FCStd",
    "out": "/home/idontsudo/framework/asp/out",
    "resultURL": "http://localhost:3002/assembly/save/out",
    "projectId": "cubes"
}
@ -1,15 +0,0 @@
import os
import json


class FS:
    @staticmethod  # these helpers take no instance state and are called on the class
    def readJSON(path: str):
        return json.loads(open(path).read())

    @staticmethod
    def writeFile(data, filePath, fileName):
        file_to_open = filePath + fileName

        f = open(file_to_open, 'w', encoding='utf-8',
                 errors='ignore')
        f.write(data)
        f.close()
@ -1,18 +0,0 @@
import FreeCAD


def is_object_solid(obj):
    """If obj is solid return True"""
    if not isinstance(obj, FreeCAD.DocumentObject):
        return False

    if not hasattr(obj, 'Shape'):
        return False

    if not hasattr(obj.Shape, 'Solids'):
        return False

    if len(obj.Shape.Solids) == 0:
        return False

    return True
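Typical use is to filter the active document down to exportable solids; a sketch that assumes a running FreeCAD session with a document open:

```python
import FreeCAD as App
from helper.is_solid import is_object_solid

solids = [obj for obj in App.ActiveDocument.Objects if is_object_solid(obj)]
print([obj.Label for obj in solids])
```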
@ -1,19 +0,0 @@
import shutil
import os

import requests
import FreeCAD as App
import FreeCADGui as Gui

from helper.fs import FS
from scenarios.robossembler_freecad_export_scenario import RobossemblerFreeCadExportScenario


def main():
    env = FS.readJSON('./env.json')
    App.openDocument(env.get('doc'))
    RobossemblerFreeCadExportScenario().call(env.get('out'))
    # requests.post(url=env.get('resultURL'), files={'zip': open(env.get('out') + '/' + 'generation.zip', "rb"), 'id': env.get('projectId')})
    # os.remove('./generation.zip')
    App.closeDocument(App.ActiveDocument.Name)
    freecadQTWindow = Gui.getMainWindow()
    freecadQTWindow.close()


main()
@ -1,13 +0,0 @@
from enum import Enum


class FilesGenerator(Enum):
    DETAIL = 'detail.json'
    ASSEMBLY = 'assembly.json'


class FolderGenerator(Enum):
    MESHES = 'meshes'
    ASSETS = 'assets'
    SDF = 'sdf'
    ASSEMBlY = 'assembly'  # note: the lowercase 'l' is kept because call sites reference it as-is
@ -1,86 +0,0 @@
from typing import Any, TypeVar, Type, cast


T = TypeVar("T")


def from_float(x: Any) -> float:
    assert isinstance(x, (float, int)) and not isinstance(x, bool)
    return float(x)


def to_float(x: Any) -> float:
    assert isinstance(x, float)
    return x


def to_class(c: Type[T], x: Any) -> dict:
    assert isinstance(x, c)
    return cast(Any, x).to_dict()


class Axis:
    x: float
    y: float
    z: float

    def __init__(self, x: float, y: float, z: float) -> None:
        self.x = x
        self.y = y
        self.z = z

    @staticmethod
    def from_dict(obj: Any) -> 'Axis':
        assert isinstance(obj, dict)
        x = from_float(obj.get("x"))
        y = from_float(obj.get("y"))
        z = from_float(obj.get("z"))
        return Axis(x, y, z)

    def to_dict(self) -> dict:
        result: dict = {}
        result["x"] = to_float(self.x)
        result["y"] = to_float(self.y)
        result["z"] = to_float(self.z)
        return result


class GeometryPart:
    euler: Axis
    position: Axis
    rotation: Axis
    center: Axis

    def __init__(self, euler: Axis, position: Axis, rotation: Axis, center: Axis) -> None:
        self.euler = euler
        self.position = position
        self.rotation = rotation
        self.center = center

    @staticmethod
    def from_dict(obj: Any) -> 'GeometryPart':
        assert isinstance(obj, dict)
        euler = Axis.from_dict(obj.get("euler"))
        position = Axis.from_dict(obj.get("position"))
        rotation = Axis.from_dict(obj.get("rotation"))
        center = Axis.from_dict(obj.get("center"))
        return GeometryPart(euler, position, rotation, center)

    def to_dict(self) -> dict:
        result: dict = {}
        result["euler"] = to_class(Axis, self.euler)
        result["position"] = to_class(Axis, self.position)
        result["rotation"] = to_class(Axis, self.rotation)
        result["center"] = to_class(Axis, self.center)
        return result

    def toJson(self) -> str:
        return str(self.to_dict()).replace('\'', '"')


def geometry_part_from_dict(s: Any) -> GeometryPart:
    return GeometryPart.from_dict(s)


def geometry_part_to_dict(x: GeometryPart) -> Any:
    return to_class(GeometryPart, x)
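A round-trip sketch for the serializers above (the values are arbitrary):

```python
from model.geometry_part import geometry_part_from_dict, geometry_part_to_dict

axis = {"x": 1.0, "y": 0.0, "z": 0.0}
part = geometry_part_from_dict(
    {"euler": axis, "position": axis, "rotation": axis, "center": axis})
assert geometry_part_to_dict(part) == part.to_dict()
print(part.toJson())
```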
@ -1,33 +0,0 @@
import Mesh
import FreeCAD as App
from random import randrange
from model.mesh_part_model import MeshPartModel


class JoinMeshModel:
    id = None
    mesh = None

    def __init__(self, meshesPartModels: list['MeshPartModel']) -> None:
        meshes = []
        for el in meshesPartModels:
            meshes.append(el.mesh.Mesh)

        self.id = 'MergedMesh' + str(randrange(1000000))
        merged_mesh = Mesh.Mesh()
        for el in meshes:
            merged_mesh.addMesh(el)

        new_obj = App.activeDocument().addObject("Mesh::Feature", self.id)
        new_obj.Mesh = merged_mesh
        new_obj.ViewObject.DisplayMode = "Flat Lines"  # set display mode to flat lines
        self.mesh = new_obj

    def remove(self):
        try:
            App.ActiveDocument.removeObject(self.id)
        except Exception as e:
            print(e)
@ -1,32 +0,0 @@
import FreeCAD as App
import uuid
import Mesh
import Part
# import PartGui
import MeshPart
from random import randrange


class MeshPartModel:
    id = None
    mesh = None

    def __init__(self, part) -> None:
        try:
            self.id = 'mesh' + str(randrange(1000000))
            document = App.ActiveDocument
            mesh = document.addObject("Mesh::Feature", self.id)
            shape = Part.getShape(part, "")
            mesh.Mesh = MeshPart.meshFromShape(
                Shape=shape, LinearDeflection=20, AngularDeflection=0.1, Relative=False)
            mesh.Label = self.id
            self.mesh = mesh
        except Exception as e:
            print(e)

    def remove(self):
        try:
            App.ActiveDocument.removeObject(self.mesh.Label)
        except Exception as e:
            print(e)
@ -1,118 +0,0 @@
import json


def from_str(x):
    assert isinstance(x, str)
    return x


def from_none(x):
    assert x is None
    return x


def from_union(fs, x):
    for f in fs:
        try:
            return f(x)
        except:
            pass
    assert False


def to_class(c, x):
    assert isinstance(x, c)
    return x.to_dict()


class SdfGeometryModel:
    def __init__(self, name, ixx, ixy, ixz, iyy, izz, massSDF, posX, posY, posZ,
                 eulerX, eulerY, eulerZ, iyz, stl, friction,
                 centerMassX, centerMassY, centerMassZ):
        self.name = name
        self.ixx = ixx
        self.ixy = ixy
        self.ixz = ixz
        self.iyy = iyy
        self.izz = izz
        self.massSDF = massSDF
        self.posX = posX
        self.posY = posY
        self.posZ = posZ
        self.eulerX = eulerX
        self.eulerY = eulerY
        self.eulerZ = eulerZ
        self.iyz = iyz
        self.stl = stl
        self.friction = friction
        self.centerMassX = centerMassX
        self.centerMassY = centerMassY
        self.centerMassZ = centerMassZ

    @staticmethod
    def from_dict(obj):
        assert isinstance(obj, dict)
        name = from_union([from_str, from_none], obj.get("name"))
        ixx = from_union([from_str, from_none], obj.get("ixx"))
        ixy = from_union([from_str, from_none], obj.get("ixy"))
        ixz = from_union([from_str, from_none], obj.get("ixz"))
        iyy = from_union([from_str, from_none], obj.get("iyy"))
        izz = from_union([from_str, from_none], obj.get("izz"))
        massSDF = from_union([from_str, from_none], obj.get("massSDF"))
        posX = from_union([from_str, from_none], obj.get("posX"))
        posY = from_union([from_str, from_none], obj.get("posY"))
        posZ = from_union([from_str, from_none], obj.get("posZ"))
        eulerX = from_union([from_str, from_none], obj.get("eulerX"))
        eulerY = from_union([from_str, from_none], obj.get("eulerY"))
        eulerZ = from_union([from_str, from_none], obj.get("eulerZ"))
        iyz = from_union([from_str, from_none], obj.get("iyz"))
        stl = from_union([from_str, from_none], obj.get("stl"))
        friction = from_union([from_str, from_none], obj.get("friction"))
        centerMassX = from_union([from_str, from_none], obj.get("centerMassX"))
        centerMassY = from_union([from_str, from_none], obj.get("centerMassY"))
        centerMassZ = from_union([from_str, from_none], obj.get("centerMassZ"))
        return SdfGeometryModel(name, ixx, ixy, ixz, iyy, izz, massSDF, posX,
                                posY, posZ, eulerX, eulerY, eulerZ, iyz, stl,
                                friction, centerMassX, centerMassY, centerMassZ)

    def to_dict(self):
        result = {}
        if self.name is not None:
            result["name"] = from_union([from_str, from_none], self.name)
        if self.ixx is not None:
            result["ixx"] = from_union([from_str, from_none], self.ixx)
        if self.ixy is not None:
            result["ixy"] = from_union([from_str, from_none], self.ixy)
        if self.ixz is not None:
            result["ixz"] = from_union([from_str, from_none], self.ixz)
        if self.iyy is not None:
            result["iyy"] = from_union([from_str, from_none], self.iyy)
        if self.izz is not None:
            result["izz"] = from_union([from_str, from_none], self.izz)
        if self.massSDF is not None:
            result["massSDF"] = from_union([from_str, from_none], self.massSDF)
        if self.posX is not None:
            result["posX"] = from_union([from_str, from_none], self.posX)
        if self.posY is not None:
            result["posY"] = from_union([from_str, from_none], self.posY)
        if self.posZ is not None:
            result["posZ"] = from_union([from_str, from_none], self.posZ)
        if self.eulerX is not None:
            result["eulerX"] = from_union([from_str, from_none], self.eulerX)
        if self.eulerY is not None:
            result["eulerY"] = from_union([from_str, from_none], self.eulerY)
        if self.eulerZ is not None:
            result["eulerZ"] = from_union([from_str, from_none], self.eulerZ)
        if self.iyz is not None:
            result["iyz"] = from_union([from_str, from_none], self.iyz)
        if self.stl is not None:
            result["stl"] = from_union([from_str, from_none], self.stl)
        if self.friction is not None:
            # serialize the friction value itself (the original mistakenly wrote eulerZ here)
            result["friction"] = from_union([from_str, from_none], self.friction)

        if self.centerMassX is not None:
            result['centerMassX'] = from_union([from_str, from_none], self.centerMassX)
        if self.centerMassY is not None:
            result['centerMassY'] = from_union([from_str, from_none], self.centerMassY)
        if self.centerMassZ is not None:
            result['centerMassZ'] = from_union([from_str, from_none], self.centerMassZ)
        return result

    def toJSON(self) -> str:
        return str(self.to_dict()).replace('\'', '"')
@ -1,30 +0,0 @@
import FreeCAD as App
import Part
from random import randrange


class SimpleCopyPartModel:
    id = None
    copyLink = None
    label = None
    part = None

    def getPart(self):
        return self.part

    def __init__(self, part) -> None:
        try:
            self.id = str(randrange(1000000))
            childObj = part
            __shape = Part.getShape(
                childObj, '', needSubElement=False, refine=False)
            obj = App.ActiveDocument.addObject('Part::Feature', self.id)
            obj.Shape = __shape
            self.part = obj
            self.label = obj.Label
            App.ActiveDocument.recompute()
        except Exception as e:
            print(e)

    def remove(self):
        App.ActiveDocument.removeObject(self.label)
@ -1,52 +0,0 @@
import os
import shutil

import FreeCAD

from usecases.export_assembly_them_all_usecase import ExportAssemblyThemAllUseCase
from usecases.export_usecase import EXPORT_TYPES, ExportUseCase
from usecases.get_sdf_geometry_usecase import SdfGeometryUseCase
from usecases.assembly_parse_usecase import AssemblyParseUseCase
from usecases.geometry_usecase import GeometryUseCase
from model.geometry_part import GeometryPart
from model.files_generator import FolderGenerator
from helper.fs import FS


class RobossemblerFreeCadExportScenario:

    def call(self, path):
        directory = path + '/' + 'generation'
        if os.path.exists(directory):
            shutil.rmtree(directory)
        if not os.path.exists(directory):
            os.makedirs(directory)

        __objs__ = FreeCAD.ActiveDocument.RootObjects
        directoryExport = directory + '/'
        os.makedirs(directoryExport + FolderGenerator.ASSETS.value)
        os.makedirs(directoryExport + FolderGenerator.SDF.value)
        os.makedirs(directoryExport + FolderGenerator.SDF.value + '/' + FolderGenerator.MESHES.value)
        os.makedirs(directoryExport + FolderGenerator.ASSEMBlY.value)
        f = open(directory + "/step-structure.json", "w")
        f.write(AssemblyParseUseCase().toJson())
        f.close()
        self.geometry(directory)
        ExportAssemblyThemAllUseCase().call(directoryExport)

        # shutil.make_archive(directory, 'zip', directory)
        # shutil.rmtree(directory)
        return True

    def geometry(self, outPutsPath: str):
        exportUseCase = ExportUseCase.call(outPutsPath, EXPORT_TYPES.OBJ)
        for el in SdfGeometryUseCase().call(exportUseCase):
            FS.writeFile(el.toJSON(), outPutsPath + '/' + FolderGenerator.ASSETS.value + '/', el.name + '.json')
@ -1,53 +0,0 @@
import FreeCAD as App


class Asm4StructureParseUseCase:
    _parts = []
    _label = []

    def getSubPartsLabel(self, group):
        groupLabel = []
        for el in group:
            if str(el) == '<Part::PartFeature>':
                groupLabel.append(el.Label)
        return groupLabel

    def parseLabel(self, nextGroup, label, level=2, nextGroupParse=0):
        if len(nextGroup) == nextGroupParse:
            return
        else:
            groupParts = []

            for el in nextGroup:
                if str(el) == '<App::Link object>':
                    groupParts.append(el)

            for el in groupParts:
                if str(el) == '<App::Link object>':
                    label.append({
                        "level": level,
                        "attachedTo": el.AttachedTo.split('#'),
                        "label": el.Label,
                        "axis": self.getSubPartsLabel(el.Group)
                    })

    def initParse(self):
        model = App.ActiveDocument.RootObjects[1]
        self._label.append({
            "level": 1,
            "attachedTo": "Parent Assembly",
            "label": model.Label,
            "axis": self.getSubPartsLabel(model.Group)
        })
        for parent in model.Group:
            if str(parent) == '<App::Link object>':
                self._label.append({
                    "level": 1,
                    "attachedTo": parent.AttachedTo.split('#'),
                    "label": parent.Label,
                    "axis": self.getSubPartsLabel(parent.Group)
                })
        print(self._label)
@ -1,58 +0,0 @@
import FreeCAD as App


def is_object_solid(obj):
    """If obj is solid return True"""
    if not isinstance(obj, App.DocumentObject):
        return False
    if hasattr(obj, 'Group'):
        return False

    if not hasattr(obj, 'Shape'):
        return False
    # if not hasattr(obj.Shape, 'Mass'):
    #     return False
    if not hasattr(obj.Shape, 'Solids'):
        return False

    if len(obj.Shape.Solids) == 0:
        return False

    return True


class AssemblyParseUseCase:
    _parts = []
    _asm = []

    def getAsm(self):
        return self._asm

    def __init__(self) -> None:
        if len(self._asm) == 0:
            self.initParse()

    def initParse(self):
        for el in App.ActiveDocument.Objects:
            if is_object_solid(el):
                self._asm.append(el.Label)

    def toJson(self):
        return str(self._asm).replace('\'', "\"")

    def getSubPartsLink(self, group):
        groupLink = {}
        for el in group:
            if is_object_solid(el):
                if str(el.Shape).find('Solid') != -1:
                    if groupLink.get(el.Label) is None:
                        groupLink[el.Label] = []
                    for i in el.Group:
                        if str(i).find('Pad') != -1:
                            groupLink[el.Label].append(i)
        if len(groupLink) == 0:
            return None
        return groupLink

    def getLinkedProperty(self):
        return self._asm
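Note that `_asm` is a class attribute, so the document is parsed once per session and later instances reuse the cached list. A usage sketch (the document path is a placeholder; must run inside FreeCAD):

```python
import FreeCAD as App
from usecases.assembly_parse_usecase import AssemblyParseUseCase

App.openDocument('/path/to/assembly.FCStd')  # placeholder path
print(AssemblyParseUseCase().getAsm())   # ordered labels of the solids
print(AssemblyParseUseCase().toJson())   # same list, JSON-encoded
```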
@ -1,92 +0,0 @@
import os
import json
from typing import List

import FreeCAD as App
import Part
import importOBJ

from model.join_mesh_model import JoinMeshModel
from model.mesh_part_model import MeshPartModel
from helper.fs import FS
from helper.is_solid import is_object_solid
from model.simple_copy_part_model import SimpleCopyPartModel
from model.files_generator import FolderGenerator
from usecases.assembly_parse_usecase import AssemblyParseUseCase


class ExportAssemblyThemAllUseCase:

    def call(self, path):
        assembly = AssemblyParseUseCase().getAsm()
        asmStructure = {}
        inc = 0
        for el in assembly:
            if inc != 0:
                asmStructure[inc] = {
                    "child": el,
                    "parents": assembly[0:inc]
                }
            inc += 1
        objectsFreeCad = App.ActiveDocument.Objects
        asmSolids = {}
        for k, v in asmStructure.items():
            assemblyParentList = v['parents']
            assemblyChild = v['child']
            for el in assemblyParentList:
                for solid in objectsFreeCad:
                    if el == solid.Label:
                        if asmSolids.get(k) is None:
                            asmSolids[k] = {'parents': [], 'child': list(
                                filter(lambda x: x.Label == assemblyChild, objectsFreeCad))[0]}
                        asmSolids[k]['parents'].append(solid)

        inc = 0
        for k, v in asmSolids.items():
            geometry = {"0": [], "1": []}
            if k != 0:
                App.activeDocument().addObject("Part::Compound", "Compound")

                copyLinks = list(
                    map(lambda el: SimpleCopyPartModel(el), v['parents']))

                if copyLinks is not None:
                    App.activeDocument().Compound.Links = list(
                        map(lambda el: el.getPart(), copyLinks))

                object = App.activeDocument().getObject('Compound')
                boundBox = object.Shape.BoundBox
                geometry['0'].append(boundBox.XMax)
                geometry['0'].append(boundBox.YMax)
                geometry['0'].append(boundBox.ZMax)

                os.makedirs(
                    path + FolderGenerator.ASSEMBlY.value + '/' + '0000' + str(k))
                boundBoxChild = v['child'].Shape.BoundBox
                geometry['1'].append(boundBoxChild.XMax)
                geometry['1'].append(boundBoxChild.YMax)
                geometry['1'].append(boundBoxChild.ZMax)
                meshParents = []

                for el in v['parents']:
                    meshParents.append(MeshPartModel(el))
                joinMesh = JoinMeshModel(meshParents)
                for el in meshParents:
                    el.remove()
                importOBJ.export(joinMesh.mesh, path + FolderGenerator.ASSEMBlY.value +
                                 '/' + '0000' + str(k) + '/' + str(1) + '.obj')
                joinMesh.remove()
                importOBJ.export(v['child'], path + FolderGenerator.ASSEMBlY.value +
                                 '/' + '0000' + str(k) + '/' + str(0) + '.obj')
                FS.writeFile(json.dumps(geometry), path + FolderGenerator.ASSEMBlY.value +
                             '/' + '0000' + str(k) + '/', 'translation.json')

                App.ActiveDocument.removeObject("Compound")
                for el in copyLinks:
                    el.remove()
                App.activeDocument().recompute()
            inc += 1
@ -1,36 +0,0 @@
# import importDAE
import Mesh
import FreeCAD as App
from model.files_generator import FolderGenerator
from helper.is_solid import is_object_solid
from enum import Enum


class EXPORT_TYPES(Enum):
    STL = 'STL'
    DAO = 'DAO'
    OBJ = 'OBJ'


class ExportUseCase:
    @staticmethod  # takes no instance state and is invoked on the class
    def call(path: str, type: EXPORT_TYPES):
        meshes = {}
        for el in App.ActiveDocument.Objects:
            if is_object_solid(el):
                match type.value:
                    case EXPORT_TYPES.STL.value:
                        Mesh.export([el], path + '/' + FolderGenerator.SDF.value +
                                    '/' + FolderGenerator.MESHES.value + '/' + el.Label + '.stl')
                        meshes[el.Label] = '/' + FolderGenerator.MESHES.value + \
                            '/' + el.Label + '.stl'
                    # case EXPORT_TYPES.DAO.value:
                    #     importDAE.export([el], path + '/' + FolderGenerator.SDF.value +
                    #                      '/' + FolderGenerator.MESHES.value + '/' + el.Label + '.dae')
                    case EXPORT_TYPES.OBJ.value:
                        import importOBJ
                        importOBJ.export([el], path + '/' + FolderGenerator.SDF.value +
                                         '/' + FolderGenerator.MESHES.value + '/' + el.Label + '.obj')
                        meshes[el.Label] = '/' + FolderGenerator.MESHES.value + \
                            '/' + el.Label + '.obj'
        return meshes
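A dispatch sketch for the exporter above: it writes one mesh per solid under `<path>/sdf/meshes/` and returns a label-to-relative-path map, which is the shape `SdfGeometryUseCase` below consumes. The path is a placeholder and the call must run inside FreeCAD with the target document active:

```python
from usecases.export_usecase import ExportUseCase, EXPORT_TYPES

meshes = ExportUseCase.call('/path/to/generation', EXPORT_TYPES.OBJ)
print(meshes)  # e.g. {'PartLabel': '/meshes/PartLabel.obj', ...}
```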
@ -1,58 +0,0 @@
import FreeCAD as App
from helper.is_solid import is_object_solid


class GeometryUseCase:
    @staticmethod  # takes no instance state and is invoked on the class
    def call() -> dict:
        labels = []
        for el in App.ActiveDocument.Objects:
            try:
                if is_object_solid(el):
                    labels.append(el.Label)

                geometry = {
                    "euler": {"x": None, "y": None, "z": None},
                    "position": {"x": None, "y": None, "z": None},
                    "rotation": {"x": None, "y": None, "z": None},
                    "center": {"x": None, "y": None, "z": None},
                }

                boundBox = el.Shape.BoundBox
                geometry["center"]["x"] = boundBox.Center.x
                geometry["center"]["y"] = boundBox.Center.y
                geometry["center"]["z"] = boundBox.Center.z
                geometry["position"]['x'] = boundBox.XMax
                geometry["position"]['y'] = boundBox.YMax
                geometry["position"]['z'] = boundBox.ZMax
                rotation = el.Placement.Rotation
                # each axis component maps to its own key (the original assigned Axis.z to 'x')
                geometry["rotation"]['x'] = rotation.Axis.x
                geometry["rotation"]['y'] = rotation.Axis.y
                geometry["rotation"]['z'] = rotation.Axis.z
                euler = el.Placement.Rotation.toEuler()
                geometry["euler"]['x'] = euler[0]
                geometry["euler"]['y'] = euler[1]
                geometry["euler"]['z'] = euler[2]
            except Exception as e:
                print(e)
        return {"geometry": geometry, "labels": labels, "label": el.Label}
@ -1,70 +0,0 @@
import FreeCAD as App
from model.sdf_geometry_model import SdfGeometryModel
from helper.is_solid import is_object_solid


class SdfGeometryUseCase:
    ShapePropertyCheck = ['Mass', 'MatrixOfInertia', 'Placement']
    PartPropertyCheck = ['Shape']

    def call(self, stlPaths: dict) -> list[SdfGeometryModel]:
        materialSolid = {}
        for el in App.ActiveDocument.Objects:
            if str(el) == '<App::MaterialObjectPython object>':
                friction = el.Material.get('SlidingFriction')
                for i in el.References:
                    materialSolid[i[0].Label] = friction
        geometry = []
        try:
            for el in App.ActiveDocument.Objects:
                if is_object_solid(el):
                    mass = el.Shape.Mass
                    inertia = el.Shape.MatrixOfInertia
                    pos = el.Shape.Placement
                    name = el.Label
                    delimiter = 1000000
                    ixx = str(inertia.A11 / delimiter)
                    ixy = str(inertia.A12 / delimiter)
                    ixz = str(inertia.A13 / delimiter)
                    iyy = str(inertia.A22 / delimiter)
                    iyz = str(inertia.A23 / delimiter)
                    izz = str(inertia.A33 / delimiter)
                    massSDF = str(mass / delimiter)
                    posX = str(pos.Base[0] / delimiter)
                    posY = str(pos.Base[1] / delimiter)
                    posZ = str(pos.Base[2] / delimiter)
                    eulerX = str(pos.Rotation.toEuler()[0])
                    eulerY = str(pos.Rotation.toEuler()[1])
                    eulerZ = str(pos.Rotation.toEuler()[2])
                    centerMassX = str(el.Shape.CenterOfMass[0])
                    centerMassY = str(el.Shape.CenterOfMass[1])
                    centerMassZ = str(el.Shape.CenterOfMass[2])
                    geometry.append(
                        SdfGeometryModel(
                            stl=stlPaths.get(el.Label),
                            name=name,
                            ixx=ixx,
                            ixz=ixz,
                            ixy=ixy,
                            iyy=iyy,
                            iyz=iyz,
                            izz=izz,
                            massSDF=massSDF,
                            posX=posX,
                            posY=posY,
                            posZ=posZ,
                            eulerX=eulerX,
                            eulerY=eulerY,
                            eulerZ=eulerZ,
                            friction=materialSolid.get(el.Label) or '',
                            centerMassX=centerMassX,
                            centerMassY=centerMassY,
                            centerMassZ=centerMassZ
                        )
                    )
        except Exception as e:
            print(e)
        return geometry
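Chaining the two use cases reproduces what `RobossemblerFreeCadExportScenario.geometry` does earlier in this diff; a sketch under the same in-FreeCAD assumption, with placeholder paths:

```python
from usecases.export_usecase import ExportUseCase, EXPORT_TYPES
from usecases.get_sdf_geometry_usecase import SdfGeometryUseCase
from helper.fs import FS

mesh_paths = ExportUseCase.call('/path/to/generation', EXPORT_TYPES.OBJ)
for geom in SdfGeometryUseCase().call(mesh_paths):
    FS.writeFile(geom.toJSON(), '/path/to/generation/assets/',
                 geom.name + '.json')
```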
96 cad_stability_check/.gitignore vendored

@ -1,96 +0,0 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# dotenv
.env

# virtualenv
.venv
venv/
ENV/

# Spyder project settings
.spyderproject

# Rope project settings
.ropeproject
env.json
out/
@ -1,405 +0,0 @@
|
|||
# Алгоритм генерации графа с помощью оценки стабильности подсборок в физическом движке PyBullet
|
||||
from typing import Any, TypeVar, Type, cast
|
||||
import FreeCAD as App
|
||||
import json
|
||||
import importOBJ
|
||||
import FreeCAD as App
|
||||
import Draft
|
||||
import os
|
||||
import Part
|
||||
import numpy as np
|
||||
from typing import TypeAlias
|
||||
import FreeCADGui as Gui
|
||||
|
||||
ISequencesUnion: TypeAlias = dict[str:dict[str:list[str]]]
|
||||
|
||||
|
||||
def importObjAtPath(path: str, inc: int):
|
||||
importOBJ.insert(u"" + path, App.ActiveDocument.Label)
|
||||
|
||||
mesh = App.ActiveDocument.Objects[inc]
|
||||
shape = Part.Shape()
|
||||
shape.makeShapeFromMesh(mesh.Mesh.Topology, 0.05)
|
||||
solid = Part.makeSolid(shape)
|
||||
Part.show(solid)
|
||||
App.ActiveDocument.Objects[inc +
|
||||
1].Label = App.ActiveDocument.Objects[inc].Name
|
||||
App.ActiveDocument.removeObject(App.ActiveDocument.Objects[inc].Name)
|
||||
return App.ActiveDocument.Objects[inc]
|
||||
|
||||
|
||||
T = TypeVar("T")
|
||||
|
||||
|
||||
def from_float(x: Any) -> float:
|
||||
assert isinstance(x, (float, int)) and not isinstance(x, bool)
|
||||
return float(x)
|
||||
|
||||
|
||||
def to_float(x: Any) -> float:
|
||||
assert isinstance(x, float)
|
||||
return x
|
||||
|
||||
|
||||
def from_str(x: Any) -> str:
|
||||
assert isinstance(x, str)
|
||||
return x
|
||||
|
||||
|
||||
def to_class(c: Type[T], x: Any) -> dict:
|
||||
assert isinstance(x, c)
|
||||
return cast(Any, x).to_dict()
|
||||
|
||||
|
||||
def euler_to_quaternion(yaw, pitch, roll):
|
||||
|
||||
qx = np.sin(roll/2) * np.cos(pitch/2) * np.cos(yaw/2) - \
|
||||
np.cos(roll/2) * np.sin(pitch/2) * np.sin(yaw/2)
|
||||
qy = np.cos(roll/2) * np.sin(pitch/2) * np.cos(yaw/2) + \
|
||||
np.sin(roll/2) * np.cos(pitch/2) * np.sin(yaw/2)
|
||||
qz = np.cos(roll/2) * np.cos(pitch/2) * np.sin(yaw/2) - \
|
||||
np.sin(roll/2) * np.sin(pitch/2) * np.cos(yaw/2)
|
||||
qw = np.cos(roll/2) * np.cos(pitch/2) * np.cos(yaw/2) + \
|
||||
np.sin(roll/2) * np.sin(pitch/2) * np.sin(yaw/2)
|
||||
|
||||
return [qx, qy, qz, qw]
|
||||
|
||||
|
||||
class Coords:
|
||||
x: float
|
||||
y: float
|
||||
z: float
|
||||
|
||||
def __init__(self, x: float, y: float, z: float) -> None:
|
||||
self.x = x
|
||||
self.y = y
|
||||
self.z = z
|
||||
|
||||
@staticmethod
|
||||
def from_dict(obj: Any) -> 'Coords':
|
||||
assert isinstance(obj, dict)
|
||||
x = from_float(obj.get("x"))
|
||||
y = from_float(obj.get("y"))
|
||||
z = from_float(obj.get("z"))
|
||||
return Coords(x, y, z)
|
||||
|
||||
def to_dict(self) -> dict:
|
||||
result: dict = {}
|
||||
result["x"] = to_float(self.x)
|
||||
result["y"] = to_float(self.y)
|
||||
result["z"] = to_float(self.z)
|
||||
return result
|
||||
|
||||
def vector(self):
|
||||
return App.Vector(self.x, self.y, self.z)
|
||||
|
||||
|
||||
class MotionResultModel:
|
||||
id: str
|
||||
euler: Coords
|
||||
position: Coords
|
||||
|
||||
def __init__(self, id: str, euler: Coords, position: Coords) -> None:
|
||||
self.id = id
|
||||
self.euler = euler
|
||||
self.position = position
|
||||
|
||||
@staticmethod
|
||||
def from_dict(obj: Any) -> 'MotionResultModel':
|
||||
assert isinstance(obj, dict)
|
||||
id = from_str(obj.get("id"))
|
||||
euler = Coords.from_dict(obj.get("euler"))
|
||||
position = Coords.from_dict(obj.get("position"))
|
||||
return MotionResultModel(id, euler, position)
|
||||
|
||||
def to_dict(self) -> dict:
|
||||
result: dict = {}
|
||||
result["id"] = from_str(self.id)
|
||||
result["euler"] = to_class(Coords, self.euler)
|
||||
result["position"] = to_class(Coords, self.position)
|
||||
return result
|
||||
|
||||
|
||||
class SequencesEvaluation:
|
||||
sequences: ISequencesUnion
|
||||
assemblyDir: str
|
||||
result: dict = {}
|
||||
|
||||
def __init__(self, sequences, assemblyDir) -> None:
|
||||
self.sequences = sequences
|
||||
self.assemblyDir = assemblyDir
|
||||
pass
|
||||
|
||||
def assemblyComputed(self):
|
||||
debug = True
|
||||
for sequenceNumber, v in self.sequences.items():
|
||||
for assemblyNumber, assemblySequenced in v.items():
|
||||
# print(assemblyNumber)
|
||||
# print()
|
||||
# if(assemblyNumber == 1 and sequenceNumber == 4 or assemblyNumber == 1 and sequenceNumber == 0):
|
||||
if(assemblyNumber == 1 and sequenceNumber == 1 and debug):
|
||||
debug = False
|
||||
if(sequenceNumber == 0):
|
||||
sequenceNumber+=1
|
||||
print(assemblySequenced)
|
||||
self.comptedAssembly(
|
||||
assemblySequenced, sequenceNumber, assemblyNumber)
|
||||
|
||||
pass
|
||||
|
||||
def comptedAssembly(self, assembly: list[str], sequenceNumber: int, assemblyNumber: int):
|
||||
assemblyParts = []
|
||||
for counter in range(len(assembly)):
|
||||
importObjAtPath(
|
||||
self.assemblyDir + 'sdf/meshes/' + assembly[counter] + '.obj',
|
||||
counter
|
||||
)
|
||||
assemblyParts.append({
|
||||
"part": App.ActiveDocument.Objects[counter],
|
||||
"name": assembly[counter]
|
||||
})
|
||||
|
||||
motionResult = json.loads((open(self.assemblyDir + 'stability' + '/' + str(
|
||||
sequenceNumber + 1) + '/' + str(assemblyNumber) + '/' + 'motion_result.json')).read())
|
||||
|
||||
simulatorMotionResults: list['MotionResultModel'] = []
|
||||
for _k, v in motionResult.items():
|
||||
simulatorMotionResults.append(MotionResultModel.from_dict(v))
|
||||
|
||||
for el in simulatorMotionResults:
|
||||
for e in assemblyParts:
|
||||
# сопоставляем детали
|
||||
if (el.id == e.get('name')):
|
||||
# вычисляем центр детали для перемещения
|
||||
center = e.get('part').Shape.CenterOfMass
|
||||
# получаем центр деталей из симуляции
|
||||
new_center = App.Vector(el.position.vector())
|
||||
|
||||
# вычисляем вектор смещения
|
||||
offset = new_center - center
|
||||
# перемещаем деталь на вектор смещения
|
||||
e.get('part').Placement.Base += offset
|
||||
|
||||
# импортируем меш связанный с зоной обьекта
|
||||
# zonePart = importObjAtPath(self.assemblyDir + "stability/zones/meshes/zone_sub_assembly" + str(
|
||||
# assembly.__len__() - 1) + ".obj", len(App.ActiveDocument.Objects))
|
||||
# получаем координаты зоны относительно детали за которой закреплена зона
|
||||
print(assemblyNumber)
|
||||
coords = json.loads(
|
||||
(open(self.assemblyDir + "stability/zones/sub_assembly_coords_" + str(assemblyNumber - 1) + ".json")).read())
|
||||
assemblyCounter = len(assemblyParts)
|
||||
detailWhichZoneBindings = assemblyParts[assemblyCounter - 2].get(
|
||||
'part')
|
||||
|
||||
detailStabilityComputed = assemblyParts[assemblyCounter - 1].get(
|
||||
'part')
|
||||
relativeCoordinates = coords.get('relativeCoordinates')
|
||||
|
||||
relativeEuler = coords.get('relativeEuler')
|
||||
|
||||
specificEuler = {
|
||||
'yaw': detailWhichZoneBindings.Placement.Rotation.toEuler()[0] - relativeEuler.get('yaw'),
|
||||
'pitch': detailWhichZoneBindings.Placement.Rotation.toEuler()[1] - relativeEuler.get('pitch'),
|
||||
'roll': detailWhichZoneBindings.Placement.Rotation.toEuler()[2] - relativeEuler.get('roll')
|
||||
}
|
||||
quaternion = euler_to_quaternion(specificEuler.get(
|
||||
'yaw'), specificEuler.get('pitch'), specificEuler.get('roll'))
|
||||
rotation = App.Rotation(
|
||||
quaternion[0], quaternion[1], quaternion[2], quaternion[3])
|
||||
|
||||
detailStabilityComputed.Placement.Rotation = rotation
|
||||
centerVector = detailWhichZoneBindings.Shape.CenterOfMass
|
||||
vector = App.Vector(relativeCoordinates.get(
|
||||
'x'), relativeCoordinates.get('y'), relativeCoordinates.get('z'))
|
||||
|
||||
# current_center = zonePart.Shape.CenterOfMass
|
||||
# move_vector = App.Vector(centerVector + vector) - current_center
|
||||
# zonePart.Placement.move(move_vector)
|
||||
|
||||
# computedStabilityResult = computedStability(
|
||||
# zonePart, detailStabilityComputed)
|
||||
|
||||
if sequenceNumber not in self.result.keys():
|
||||
self.result[sequenceNumber] = []
|
||||
|
||||
# self.result[sequenceNumber].append({
|
||||
# str(assemblyNumber): assembly,
|
||||
# "result": computedStabilityResult
|
||||
# })
|
||||
|
||||
# for part in App.ActiveDocument.Objects:
|
||||
# App.ActiveDocument.removeObject(part.Name)
|
||||
|
||||
|
||||
def get_part_center(part):
|
||||
shape = None
|
||||
if not hasattr(part, 'Shape'):
|
||||
shape = part.Mesh
|
||||
if hasattr(part, 'Shape'):
|
||||
shape = part.Shape
|
||||
|
||||
center = shape.BoundBox.Center
|
||||
return App.Vector(center[0],
|
||||
center[1],
|
||||
center[2])
|
||||
|
||||
|
||||
def move_second_part_to_match_center(first_part, second_part):
|
||||
first_center = get_part_center(first_part)
|
||||
second_center = get_part_center(second_part)
|
||||
offset = first_center - second_center
|
||||
second_part.Placement.move(offset)
|
||||
|
||||
|
||||
def create(part):
|
||||
clone = Draft.make_clone([part], forcedraft=True)
|
||||
|
||||
clone.Scale = App.Vector(1.30, 1.30, 1.30)
|
||||
clone_corr = (App.Vector(0.4476673941774023, -2.109332894191716, -0.5918687740295264) -
|
||||
clone.Placement.Base).scale(*App.Vector(-0.25, -0.25, -0.25))
|
||||
clone.Placement.move(clone_corr)
|
||||
App.ActiveDocument.recompute()
|
||||
|
||||
return clone
|
||||
|
||||
|
||||
def getFullPathObj(assemblyFolder: str, name: str):
|
||||
return assemblyFolder + 'sdf/meshes/' + name + '.obj'
|
||||
|
||||
|
||||
def computedStability(refElement, childElement):
|
||||
rootElement = childElement.Shape.BoundBox
|
||||
# Создание обьекта на котором делается операция пересечения
|
||||
App.activeDocument().addObject("Part::MultiCommon", "Common")
|
||||
App.activeDocument().Common.Shapes = [refElement, childElement, ]
|
||||
App.ActiveDocument.getObject('Common').ViewObject.ShapeColor = getattr(App.ActiveDocument.getObject(
|
||||
refElement.Name).getLinkedObject(True).ViewObject, 'ShapeColor', App.ActiveDocument.getObject('Common').ViewObject.ShapeColor)
|
||||
App.ActiveDocument.getObject('Common').ViewObject.DisplayMode = getattr(App.ActiveDocument.getObject(
|
||||
childElement.Name).getLinkedObject(True).ViewObject, 'DisplayMode', App.ActiveDocument.getObject('Common').ViewObject.DisplayMode)
|
||||
App.ActiveDocument.recompute()
|
||||
obj = App.ActiveDocument.getObjectsByLabel('Common')[0]
|
||||
|
||||
shp = obj.Shape
|
||||
bbox = shp.BoundBox
|
||||
# Если после операции пересечения зона обьекта совпадает с зоной тестируемого обьекта то тест прошел успешно
|
||||
if bbox.XLength == rootElement.XLength and bbox.YLength == rootElement.YLength and rootElement.ZLength == bbox.ZLength:
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
def autoStabilityZoneComputed(stepFilesPaths: list[str], directoryStableZonesPath: str):
|
||||
|
||||
cadObjects = []
|
||||
|
||||
for count in range(len(stepFilesPaths)):
|
||||
|
||||
importObjAtPath(stepFilesPaths[count], count)
|
||||
cadObjects.append(App.ActiveDocument.Objects[count])
|
||||
|
||||
assemblesBindings = []
|
||||
|
||||
for increment in range(len(cadObjects)):
|
||||
if (increment != 0):
|
||||
detailForEvaluationZ = cadObjects[increment]
|
||||
zoneBindingDetailZ = cadObjects[increment-1]
|
||||
assemblesBindings.append(
    {'zoneBindingDetail': detailForEvaluationZ,
     'detailForEvaluation': zoneBindingDetailZ,
     'relativeCoordinates': None,
     'zonePart': None})

for increment in range(len(assemblesBindings)):
    el = assemblesBindings[increment]

    # Reference part that the stability zone is bound to
    zoneBindingDetail = el.get('zoneBindingDetail')
    zoneBindingDetailCenterVector = zoneBindingDetail.Shape.CenterOfMass

    # Build the zone part for the evaluated detail and align it with
    # the reference part's center of mass
    zoneDetail = create(el.get('detailForEvaluation'))
    move_second_part_to_match_center(el.get('zoneBindingDetail'), zoneDetail)
    zoneDetail.Label = 'zone_sub_assembly' + str(increment + 1)

    # Render the zone semi-transparent for visual inspection
    zoneDetail.ViewObject.ShapeColor = (0.40, 0.74, 0.71)
    zoneDetail.ViewObject.Transparency = 50

    zoneDetailCenterVector = el.get('detailForEvaluation').Shape.CenterOfMass

    # Offset between the two centers of mass
    el['relativeCoordinates'] = {
        'x': zoneBindingDetailCenterVector.x - zoneDetailCenterVector.x,
        'y': zoneBindingDetailCenterVector.y - zoneDetailCenterVector.y,
        'z': zoneBindingDetailCenterVector.z - zoneDetailCenterVector.z
    }

    # Orientation difference between the two placements, in degrees
    bindingEuler = zoneBindingDetail.Placement.Rotation.toEuler()
    detailEuler = el.get('detailForEvaluation').Placement.Rotation.toEuler()
    el['relativeEuler'] = {
        'yaw': bindingEuler[0] - detailEuler[0],
        'pitch': bindingEuler[1] - detailEuler[1],
        'roll': bindingEuler[2] - detailEuler[2]
    }
    el['zonePart'] = zoneDetail

# Export every zone part as an OBJ mesh
meshesPath = directoryStableZonesPath + 'meshes/'
if not os.path.exists(directoryStableZonesPath):
    os.makedirs(directoryStableZonesPath)
if not os.path.exists(meshesPath):
    os.makedirs(meshesPath)

zonesSaved = {}
for counter in range(len(assemblesBindings)):
    zoneComputed = assemblesBindings[counter]
    mesh = zoneComputed.get('zonePart')
    zonesSavePath = meshesPath + mesh.Label + '.obj'
    importOBJ.export([mesh], zonesSavePath)
    zonesSaved[mesh.Label] = 'meshes/' + mesh.Label + '.obj'

# Replace FreeCAD objects with serializable labels and mesh paths,
# then write one JSON file per binding
for counter in range(len(assemblesBindings)):
    el = assemblesBindings[counter]
    el['zonePart'] = zonesSaved[el.get('zonePart').Label]
    el['detailForEvaluation'] = el['detailForEvaluation'].Label
    el['zoneBindingDetail'] = el['zoneBindingDetail'].Label

    json_result = json.dumps(el)
    file_to_open = directoryStableZonesPath + \
        'sub_assembly_coords_' + str(counter) + '.json'
    with open(file_to_open, 'w') as f:
        f.write(json_result)


def main():
    App.newDocument()
    env = json.loads(open('./env.json').read())

    assemblyDir = env.get('aspPath')
    sequencesJSON = json.loads(open(assemblyDir + 'sequences.json').read())
    directoryStableZones = assemblyDir + 'stability/zones/'

    sequences = sequencesJSON.get('sequences')
    stepStructure = json.loads(open(assemblyDir + 'step-structure.json').read())

    stepFilesPaths = []
    for step in stepStructure:
        stepFilesPaths.append(assemblyDir + 'sdf/meshes/' + step + '.obj')

    if not os.path.exists(directoryStableZones):
        print('Zones not found, automatic calculation started')
        autoStabilityZoneComputed(stepFilesPaths, directoryStableZones)

    # For every assembly sequence, collect its cumulative prefixes:
    # sequencesJoin[i][k] holds the first k + 1 parts of sequence i
    sequencesJoin = {}
    for arrayCounter in range(len(sequences)):
        for indexCounter in range(len(sequences[arrayCounter])):
            if indexCounter != 0:
                if sequencesJoin.get(arrayCounter) is None:
                    sequencesJoin[arrayCounter] = {
                        indexCounter: sequences[arrayCounter][0:indexCounter + 1]
                    }
                else:
                    sequencesJoin[arrayCounter][indexCounter] = \
                        sequences[arrayCounter][0:indexCounter + 1]

    seqEvaluation = SequencesEvaluation(sequencesJoin, assemblyDir)
    seqEvaluation.assemblyComputed()
    print(seqEvaluation.result)


main()
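
For orientation, here is a minimal sketch of the record each `sub_assembly_coords_<N>.json` file ends up holding, reconstructed from the keys the script serializes above; the part labels and numeric values are illustrative, not taken from a real run:

```
# Illustrative only: shape of one emitted record (values are made up).
import json

example = {
    'zoneBindingDetail': 'body_base',        # Label of the reference part
    'detailForEvaluation': 'body_cap',       # Label of the evaluated part
    'relativeCoordinates': {'x': 0.0, 'y': -12.5, 'z': 4.2},    # center-of-mass offset
    'relativeEuler': {'yaw': 0.0, 'pitch': 90.0, 'roll': 0.0},  # orientation delta, degrees
    'zonePart': 'meshes/zone_sub_assembly1.obj'  # relative path to the exported zone mesh
}
print(json.dumps(example, indent=2))
```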
@@ -1,505 +0,0 @@
                  GNU LESSER GENERAL PUBLIC LICENSE
                       Version 2.1, February 1999

 Copyright (C) 1991, 1999 Free Software Foundation, Inc.
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

(This is the first released version of the Lesser GPL.  It also counts
 as the successor of the GNU Library Public License, version 2, hence
 the version number 2.1.)

                            Preamble

  The licenses for most software are designed to take away your
freedom to share and change it.  By contrast, the GNU General Public
Licenses are intended to guarantee your freedom to share and change
free software--to make sure the software is free for all its users.

  This license, the Lesser General Public License, applies to some
specially designated software packages--typically libraries--of the
Free Software Foundation and other authors who decide to use it.  You
can use it too, but we suggest you first think carefully about whether
this license or the ordinary General Public License is the better
strategy to use in any particular case, based on the explanations below.

  When we speak of free software, we are referring to freedom of use,
not price.  Our General Public Licenses are designed to make sure that
you have the freedom to distribute copies of free software (and charge
for this service if you wish); that you receive source code or can get
it if you want it; that you can change the software and use pieces of
it in new free programs; and that you are informed that you can do
these things.

  To protect your rights, we need to make restrictions that forbid
distributors to deny you these rights or to ask you to surrender these
rights.  These restrictions translate to certain responsibilities for
you if you distribute copies of the library or if you modify it.

  For example, if you distribute copies of the library, whether gratis
or for a fee, you must give the recipients all the rights that we gave
you.  You must make sure that they, too, receive or can get the source
code.  If you link other code with the library, you must provide
complete object files to the recipients, so that they can relink them
with the library after making changes to the library and recompiling
it.  And you must show them these terms so they know their rights.

  We protect your rights with a two-step method: (1) we copyright the
library, and (2) we offer you this license, which gives you legal
permission to copy, distribute and/or modify the library.

  To protect each distributor, we want to make it very clear that
there is no warranty for the free library.  Also, if the library is
modified by someone else and passed on, the recipients should know
that what they have is not the original version, so that the original
author's reputation will not be affected by problems that might be
introduced by others.

  Finally, software patents pose a constant threat to the existence of
any free program.  We wish to make sure that a company cannot
effectively restrict the users of a free program by obtaining a
restrictive license from a patent holder.  Therefore, we insist that
any patent license obtained for a version of the library must be
consistent with the full freedom of use specified in this license.

  Most GNU software, including some libraries, is covered by the
ordinary GNU General Public License.  This license, the GNU Lesser
General Public License, applies to certain designated libraries, and
is quite different from the ordinary General Public License.  We use
this license for certain libraries in order to permit linking those
libraries into non-free programs.

  When a program is linked with a library, whether statically or using
a shared library, the combination of the two is legally speaking a
combined work, a derivative of the original library.  The ordinary
General Public License therefore permits such linking only if the
entire combination fits its criteria of freedom.  The Lesser General
Public License permits more lax criteria for linking other code with
the library.

  We call this license the "Lesser" General Public License because it
does Less to protect the user's freedom than the ordinary General
Public License.  It also provides other free software developers Less
of an advantage over competing non-free programs.  These disadvantages
are the reason we use the ordinary General Public License for many
libraries.  However, the Lesser license provides advantages in certain
special circumstances.

  For example, on rare occasions, there may be a special need to
encourage the widest possible use of a certain library, so that it becomes
a de-facto standard.  To achieve this, non-free programs must be
allowed to use the library.  A more frequent case is that a free
library does the same job as widely used non-free libraries.  In this
case, there is little to gain by limiting the free library to free
software only, so we use the Lesser General Public License.

  In other cases, permission to use a particular library in non-free
programs enables a greater number of people to use a large body of
free software.  For example, permission to use the GNU C Library in
non-free programs enables many more people to use the whole GNU
operating system, as well as its variant, the GNU/Linux operating
system.

  Although the Lesser General Public License is Less protective of the
users' freedom, it does ensure that the user of a program that is
linked with the Library has the freedom and the wherewithal to run
that program using a modified version of the Library.

  The precise terms and conditions for copying, distribution and
modification follow.  Pay close attention to the difference between a
"work based on the library" and a "work that uses the library".  The
former contains code derived from the library, whereas the latter must
be combined with the library in order to run.

                  GNU LESSER GENERAL PUBLIC LICENSE
   TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

  0. This License Agreement applies to any software library or other
program which contains a notice placed by the copyright holder or
other authorized party saying it may be distributed under the terms of
this Lesser General Public License (also called "this License").
Each licensee is addressed as "you".

  A "library" means a collection of software functions and/or data
prepared so as to be conveniently linked with application programs
(which use some of those functions and data) to form executables.

  The "Library", below, refers to any such software library or work
which has been distributed under these terms.  A "work based on the
Library" means either the Library or any derivative work under
copyright law: that is to say, a work containing the Library or a
portion of it, either verbatim or with modifications and/or translated
straightforwardly into another language.  (Hereinafter, translation is
included without limitation in the term "modification".)

  "Source code" for a work means the preferred form of the work for
making modifications to it.  For a library, complete source code means
all the source code for all modules it contains, plus any associated
interface definition files, plus the scripts used to control compilation
and installation of the library.

  Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope.  The act of
running a program using the Library is not restricted, and output from
such a program is covered only if its contents constitute a work based
on the Library (independent of the use of the Library in a tool for
writing it).  Whether that is true depends on what the Library does
and what the program that uses the Library does.

  1. You may copy and distribute verbatim copies of the Library's
complete source code as you receive it, in any medium, provided that
you conspicuously and appropriately publish on each copy an
appropriate copyright notice and disclaimer of warranty; keep intact
all the notices that refer to this License and to the absence of any
warranty; and distribute a copy of this License along with the
Library.

  You may charge a fee for the physical act of transferring a copy,
and you may at your option offer warranty protection in exchange for a
fee.

  2. You may modify your copy or copies of the Library or any portion
of it, thus forming a work based on the Library, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

    a) The modified work must itself be a software library.

    b) You must cause the files modified to carry prominent notices
    stating that you changed the files and the date of any change.

    c) You must cause the whole of the work to be licensed at no
    charge to all third parties under the terms of this License.

    d) If a facility in the modified Library refers to a function or a
    table of data to be supplied by an application program that uses
    the facility, other than as an argument passed when the facility
    is invoked, then you must make a good faith effort to ensure that,
    in the event an application does not supply such function or
    table, the facility still operates, and performs whatever part of
    its purpose remains meaningful.

    (For example, a function in a library to compute square roots has
    a purpose that is entirely well-defined independent of the
    application.  Therefore, Subsection 2d requires that any
    application-supplied function or table used by this function must
    be optional: if the application does not supply it, the square
    root function must still compute square roots.)

These requirements apply to the modified work as a whole.  If
identifiable sections of that work are not derived from the Library,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works.  But when you
distribute the same sections as part of a whole which is a work based
on the Library, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote
it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Library.

In addition, mere aggregation of another work not based on the Library
with the Library (or with a work based on the Library) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

  3. You may opt to apply the terms of the ordinary GNU General Public
License instead of this License to a given copy of the Library.  To do
this, you must alter all the notices that refer to this License, so
that they refer to the ordinary GNU General Public License, version 2,
instead of to this License.  (If a newer version than version 2 of the
ordinary GNU General Public License has appeared, then you can specify
that version instead if you wish.)  Do not make any other change in
these notices.

  Once this change is made in a given copy, it is irreversible for
that copy, so the ordinary GNU General Public License applies to all
subsequent copies and derivative works made from that copy.

  This option is useful when you wish to copy part of the code of
the Library into a program that is not a library.

  4. You may copy and distribute the Library (or a portion or
derivative of it, under Section 2) in object code or executable form
under the terms of Sections 1 and 2 above provided that you accompany
it with the complete corresponding machine-readable source code, which
must be distributed under the terms of Sections 1 and 2 above on a
medium customarily used for software interchange.

  If distribution of object code is made by offering access to copy
from a designated place, then offering equivalent access to copy the
source code from the same place satisfies the requirement to
distribute the source code, even though third parties are not
compelled to copy the source along with the object code.

  5. A program that contains no derivative of any portion of the
Library, but is designed to work with the Library by being compiled or
linked with it, is called a "work that uses the Library".  Such a
work, in isolation, is not a derivative work of the Library, and
therefore falls outside the scope of this License.

  However, linking a "work that uses the Library" with the Library
creates an executable that is a derivative of the Library (because it
contains portions of the Library), rather than a "work that uses the
library".  The executable is therefore covered by this License.
Section 6 states terms for distribution of such executables.

  When a "work that uses the Library" uses material from a header file
that is part of the Library, the object code for the work may be a
derivative work of the Library even though the source code is not.
Whether this is true is especially significant if the work can be
linked without the Library, or if the work is itself a library.  The
threshold for this to be true is not precisely defined by law.

  If such an object file uses only numerical parameters, data
structure layouts and accessors, and small macros and small inline
functions (ten lines or less in length), then the use of the object
file is unrestricted, regardless of whether it is legally a derivative
work.  (Executables containing this object code plus portions of the
Library will still fall under Section 6.)

  Otherwise, if the work is a derivative of the Library, you may
distribute the object code for the work under the terms of Section 6.
Any executables containing that work also fall under Section 6,
whether or not they are linked directly with the Library itself.

  6. As an exception to the Sections above, you may also combine or
link a "work that uses the Library" with the Library to produce a
work containing portions of the Library, and distribute that work
under terms of your choice, provided that the terms permit
modification of the work for the customer's own use and reverse
engineering for debugging such modifications.

  You must give prominent notice with each copy of the work that the
Library is used in it and that the Library and its use are covered by
this License.  You must supply a copy of this License.  If the work
during execution displays copyright notices, you must include the
copyright notice for the Library among them, as well as a reference
directing the user to the copy of this License.  Also, you must do one
of these things:

    a) Accompany the work with the complete corresponding
    machine-readable source code for the Library including whatever
    changes were used in the work (which must be distributed under
    Sections 1 and 2 above); and, if the work is an executable linked
    with the Library, with the complete machine-readable "work that
    uses the Library", as object code and/or source code, so that the
    user can modify the Library and then relink to produce a modified
    executable containing the modified Library.  (It is understood
    that the user who changes the contents of definitions files in the
    Library will not necessarily be able to recompile the application
    to use the modified definitions.)

    b) Use a suitable shared library mechanism for linking with the
    Library.  A suitable mechanism is one that (1) uses at run time a
    copy of the library already present on the user's computer system,
    rather than copying library functions into the executable, and (2)
    will operate properly with a modified version of the library, if
    the user installs one, as long as the modified version is
    interface-compatible with the version that the work was made with.

    c) Accompany the work with a written offer, valid for at
    least three years, to give the same user the materials
    specified in Subsection 6a, above, for a charge no more
    than the cost of performing this distribution.

    d) If distribution of the work is made by offering access to copy
    from a designated place, offer equivalent access to copy the above
    specified materials from the same place.

    e) Verify that the user has already received a copy of these
    materials or that you have already sent this user a copy.

  For an executable, the required form of the "work that uses the
Library" must include any data and utility programs needed for
reproducing the executable from it.  However, as a special exception,
the materials to be distributed need not include anything that is
normally distributed (in either source or binary form) with the major
components (compiler, kernel, and so on) of the operating system on
which the executable runs, unless that component itself accompanies
the executable.

  It may happen that this requirement contradicts the license
restrictions of other proprietary libraries that do not normally
accompany the operating system.  Such a contradiction means you cannot
use both them and the Library together in an executable that you
distribute.

  7. You may place library facilities that are a work based on the
Library side-by-side in a single library together with other library
facilities not covered by this License, and distribute such a combined
library, provided that the separate distribution of the work based on
the Library and of the other library facilities is otherwise
permitted, and provided that you do these two things:

    a) Accompany the combined library with a copy of the same work
    based on the Library, uncombined with any other library
    facilities.  This must be distributed under the terms of the
    Sections above.

    b) Give prominent notice with the combined library of the fact
    that part of it is a work based on the Library, and explaining
    where to find the accompanying uncombined form of the same work.

  8. You may not copy, modify, sublicense, link with, or distribute
the Library except as expressly provided under this License.  Any
attempt otherwise to copy, modify, sublicense, link with, or
distribute the Library is void, and will automatically terminate your
rights under this License.  However, parties who have received copies,
or rights, from you under this License will not have their licenses
terminated so long as such parties remain in full compliance.

  9. You are not required to accept this License, since you have not
signed it.  However, nothing else grants you permission to modify or
distribute the Library or its derivative works.  These actions are
prohibited by law if you do not accept this License.  Therefore, by
modifying or distributing the Library (or any work based on the
Library), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Library or works based on it.

  10. Each time you redistribute the Library (or any work based on the
Library), the recipient automatically receives a license from the
original licensor to copy, distribute, link with or modify the Library
subject to these terms and conditions.  You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties with
this License.

  11. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License.  If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Library at all.  For example, if a patent
license would not permit royalty-free redistribution of the Library by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Library.

If any portion of this section is held invalid or unenforceable under any
particular circumstance, the balance of the section is intended to apply,
and the section as a whole is intended to apply in other circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system which is
implemented by public license practices.  Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

  12. If the distribution and/or use of the Library is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Library under this License may add
an explicit geographical distribution limitation excluding those countries,
so that distribution is permitted only in or among countries not thus
excluded.  In such case, this License incorporates the limitation as if
written in the body of this License.

  13. The Free Software Foundation may publish revised and/or new
versions of the Lesser General Public License from time to time.
Such new versions will be similar in spirit to the present version,
but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number.  If the Library
specifies a version number of this License which applies to it and
"any later version", you have the option of following the terms and
conditions either of that version or of any later version published by
the Free Software Foundation.  If the Library does not specify a
license version number, you may choose any version ever published by
the Free Software Foundation.

  14. If you wish to incorporate parts of the Library into other free
programs whose distribution conditions are incompatible with these,
write to the author to ask for permission.  For software which is
copyrighted by the Free Software Foundation, write to the Free
Software Foundation; we sometimes make exceptions for this.  Our
decision will be guided by the two goals of preserving the free status
of all derivatives of our free software and of promoting the sharing
and reuse of software generally.

                            NO WARRANTY

  15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO
WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW.
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR
OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY
KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE.  THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE
LIBRARY IS WITH YOU.  SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME
THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN
WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY
AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU
FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR
CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE
LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING
RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A
FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF
SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH
DAMAGES.

                     END OF TERMS AND CONDITIONS

           How to Apply These Terms to Your New Libraries

  If you develop a new library, and you want it to be of the greatest
possible use to the public, we recommend making it free software that
everyone can redistribute and change.  You can do so by permitting
redistribution under these terms (or, alternatively, under the terms of the
ordinary General Public License).

  To apply these terms, attach the following notices to the library.  It is
safest to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least the
"copyright" line and a pointer to where the full notice is found.

    {description}
    Copyright (C) {year} {fullname}

    This library is free software; you can redistribute it and/or
    modify it under the terms of the GNU Lesser General Public
    License as published by the Free Software Foundation; either
    version 2.1 of the License, or (at your option) any later version.

    This library is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
    Lesser General Public License for more details.

    You should have received a copy of the GNU Lesser General Public
    License along with this library; if not, write to the Free Software
    Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301
    USA

Also add information on how to contact you by electronic and paper mail.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary.  Here is a sample; alter the names:

  Yoyodyne, Inc., hereby disclaims all copyright interest in the
  library `Frob' (a library for tweaking knobs) written by James Random
  Hacker.

  {signature of Ty Coon}, 1 April 1990
  Ty Coon, President of Vice

That's all there is to it!
Some files were not shown because too many files have changed in this diff