// file : doc/intro.cli
// copyright : Copyright (c) 2014-2019 Code Synthesis Ltd
// license : MIT; see accompanying LICENSE file
"\name=build2-toolchain-intro"
"\subject=toolchain"
"\title=Toolchain Introduction"
// TODO
//
// @@ refs to further docs
//
// STYLE
//
// @@ section boundary page breaks (<hr class="page-break"/>)
// @@ when printed, code background is gone, but spaces still there
//
// PDF
//
// @@ tree output is garbled
// @@ Could we use a nicer font, seeing that we embed them?
//
// NOTES
//
// - Maximum <pre> line is 70 characters.
//
"
\h0#preface|Preface|
This document is an overall introduction to the \c{build2} toolchain that
shows how the main components, namely the build system, the package dependency
manager, and the project dependency manager are used together to handle the
entire C++ project development lifecycle: creation, development, testing, and
delivery. For additional information, including documentation for individual
toolchain components, man pages, etc., refer to the \c{build2} project
\l{https://build2.org/doc.xhtml Documentation} page.
\h1#tldr|TL;DR|
\
$ git clone ssh://example.org/hello.git
$ tree hello
hello/
├── hello/
│   ├── hello.cxx
│   └── buildfile
├── manifest
└── repositories.manifest
$ cd hello
$ bdep init --config-create ../hello-gcc cc config.cxx=g++
initializing in project /tmp/hello/
created configuration /tmp/hello-gcc/ default,auto-synchronized
synchronizing:
new hello/0.1.0
$ b
c++ hello/cxx{hello}@../hello-gcc/hello/hello/
ld ../hello-gcc/hello/hello/exe{hello}
ln ../hello-gcc/hello/hello/exe{hello} -> hello/
$ hello/hello World
Hello, World!
$ edit repositories.manifest # add https://example.org/libhello.git
$ edit manifest # add 'depends: libhello ^1.0.0'
$ edit hello/buildfile # import libhello
$ edit hello/hello.cxx # use libhello
$ b
fetching from https://example.org/libhello.git
synchronizing /tmp/hello-gcc/:
new libhello/1.0.0 (required by hello)
reconfigure hello/0.1.0
c++ ../hello-gcc/libhello-1.0.0/libhello/cxx{hello}
ld ../hello-gcc/libhello-1.0.0/libhello/libs{hello}
c++ hello/cxx{hello}@../hello-gcc/hello/hello/
ld ../hello-gcc/hello/hello/exe{hello}
ln ../hello-gcc/hello/hello/exe{hello} -> hello/
$ bdep fetch # refresh available versions
$ bdep status -i # review available versions
hello configured 0.1.0
libhello ^1.0.0 configured 1.0.0 available [1.1.0]
$ bdep sync libhello # upgrade to latest
synchronizing:
new libformat/1.0.0 (required by libhello)
new libprint/1.0.0 (required by libhello)
upgrade libhello/1.1.0
reconfigure hello/0.1.0
$ bdep sync libhello/1.0.0 # downgrade
synchronizing:
drop libprint/1.0.0 (unused)
drop libformat/1.0.0 (unused)
downgrade libhello/1.0.0
reconfigure hello/0.1.0
\
\h1#guide|Getting Started Guide|
The aim of this guide is to get you started developing C/C++ projects with the
\c{build2} toolchain. All the examples in this section include the relevant
command output, so if you just want to get a sense of what \c{build2} is about,
you don't have to install the toolchain and run the commands in order to
follow along. Or, alternatively, you can take a short detour to the
\l{https://build2.org/install.xhtml Installation Instructions} and then try
the examples for yourself.
One of the primary goals of the \c{build2} toolchain is to provide a uniform
interface across all the platforms and compilers. While most of the examples
in this document assume a UNIX-like operating system, they will look pretty
similar if you are on Windows. You just have to use appropriate paths,
compilers, and options.
The question we will try to answer in this section can be summarized as:
\
$ git clone .../hello.git && now-what?
\
That is, we clone an existing C/C++ project or would like to create a new one
and then start hacking on it. We want to spend as little time and energy as
possible on the initial and ongoing infrastructure maintenance: setting up
build configurations, managing dependencies, continuous integration and
testing, release management, etc. Or, as one C++ user aptly put it, \"\i{All I
want to do is program.}\"
\h#guide-hello|Hello, World|
Let's see what programming with \c{build2} feels like by starting with a
customary \i{\"Hello, World!\"} program (here we assume our current working
directory is \c{/tmp}):
\
$ bdep new -t exe -l c++ hello
created new executable project hello in /tmp/hello/
\
The \l{bdep-new(1)} command creates a \i{canonical} \c{build2} project. In
our case it is an executable implemented in C++.
\N|To create a library, pass \c{-t\ lib}. By default \c{new} also initializes
a \c{git} repository and generates suitable \c{.gitignore} files (pass \c{-s\
none} if you don't want that). And for details on naming your projects, see
\l{bpkg#package-name Package Name}.|
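For example, the library variant of such a project (and, as we will see later,
the exact form used to create the \c{libhello} package that we will depend on)
would be started with:
\
$ bdep new -t lib -l c++ libhello
\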
\N|Note to Windows users: the \c{build2-baseutils} package includes core
\c{git} utilities that are sufficient for the \c{bdep} functionality.|
Let's take a look inside our new project:
\
$ tree hello
hello/
├── .git/
├── .bdep/
├── build/
├── hello/
│   ├── hello.cxx
│   ├── buildfile
│   └── testscript
├── buildfile
├── manifest
└── repositories.manifest
\
\N|While the canonical project structure is strongly recommended, especially
for new projects, \c{build2} is flexible enough to allow most commonly used
arrangements. See \l{#proj-struct Canonical Project Structure} for a more
detailed discussion and the rationale behind this layout.|
Similar to version control tools, we normally run all \c{build2} tools from
the project's source directory or one of its subdirectories, so:
\
$ cd hello
\
While the project layout is discussed in more detail in later sections, let's
examine a couple of interesting files to get a sense of what's going on. We
start with the source file which should look familiar:
\
$ cat hello/hello.cxx
#include <iostream>

int main (int argc, char* argv[])
{
  using namespace std;

  if (argc < 2)
  {
    cerr << \"error: missing name\" << endl;
    return 1;
  }

  cout << \"Hello, \" << argv[1] << '!' << endl;
}
\
\N|If you prefer the \c{.?pp} extensions over \c{.?xx} for your C++ source
files, pass \c{-l\ c++,cpp} to the \c{new} command. See \l{bdep-new(1)} for
details on this and other customization options.|
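For instance, this is how the same project could have been created with the
alternative extensions:
\
$ bdep new -t exe -l c++,cpp hello
\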
Let's take a look at the accompanying \c{buildfile}:
\
$ cat hello/buildfile
libs =
#import libs += libhello%lib{hello}
exe{hello}: {hxx ixx txx cxx}{**} $libs testscript
\
As the name suggests, this file describes how to build things. While its
content might look a bit cryptic, let's try to infer a couple of points
without going into too much detail (for details see the build system
\l{b#intro Introduction}).
That \c{exe{hello\}} on the left of \c{:} is a \i{target} (executable named
\c{hello}) and what we have on the right are \i{prerequisites} (C++ source
files, libraries, etc). This \c{buildfile} uses wildcard patterns (that
\c{**}) to automatically locate all the C++ source files. This means we don't
have to edit our \c{buildfile} every time we add, remove, or rename a source
file in our project. There also appears to be some (commented out)
infrastructure for importing and linking libraries (that \c{libs}
variable). We will see how to use it in a moment.
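To see the wildcard in action, suppose we later split the implementation into
several translation units, for example (the extra file names here are purely
illustrative):
\
hello/
├── hello.cxx
├── utility.hxx
└── utility.cxx
\
All of them would be picked up automatically, without any \c{buildfile}
changes.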
\N|In simple projects that follow the canonical structure we can often
completely ignore the presence of the build definition files thus approaching
the \i{build system-less} workflow found in languages like Rust and Go.|
Finally, the \c{buildfile} also lists \c{testscript} as a prerequisite of
\c{hello}. This file tests our target. Let's take a look inside:
\
$ cat hello/testscript
: basics
:
$* 'World' >'Hello, World!'

: missing-name
:
$* 2>>EOE != 0
error: missing name
EOE
\
Again, we are not going into detail here (see \l{testscript#intro Testscript
Introduction} for a proper introduction), but to give you an idea, here we
have two tests: the first (with id \c{basics}) verifies that our program
prints the expected greeting while the second makes sure it handles the
missing name error condition. Tests written in Testscript are concise,
portable, and executed in parallel.
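We will run these tests in a moment; as a preview, the entire test suite is
executed with a single build system command:
\
$ b test
\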
Next up is \c{manifest}:
\
$ cat manifest
: 1
name: hello
version: 0.1.0-a.0.z
summary: hello executable
license: TODO
url: https://example.org/hello
email: you@example.org
#depends: libhello ^1.0.0
\
The \c{manifest} file is what makes a build system project a \i{package}. It
contains all the metadata that a user of a package might need to know: its
name, version, license, dependencies, etc., all in one place.
\N|Refer to \l{bpkg#manifest-format Manifest Format} for the general format of
\c{build2} manifest files and to \l{bpkg#manifest-package Package Manifest}
for details on the package manifest values.|
As you can see, \c{manifest} created by \l{bdep-new(1)} contains some dummy
values which you would want to adjust before publishing your package. But
let's resist the urge to adjust that strange-looking \c{0.1.0-a.0.z} until we
discuss package versioning.
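For example, a filled-in \c{manifest} for this project might end up with
values along these lines (illustrative only; substitute your actual details):
\
summary: command line greeting program
license: MIT
url: https://github.com/john-doe/hello
email: john-doe@example.org
\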
\N|Next to \c{manifest} you might have noticed the \c{repositories.manifest}
file \- we will discuss its function later, when we talk about dependencies
and where they come from.|
Project in hand, let's build it. Unlike other programming languages, C++
development usually involves juggling a handful of build configurations:
several compilers and/or targets (\c{build2} is big on cross-compiling),
debug/release, different sanitizers and/or static analysis tools, and so
on. As a result, \c{build2} is optimized for multi-configuration
usage. However, as we will see shortly, one build configuration can be
designated as the default with additional conveniences.
The \l{bdep-init(1)} command is used to initialize a project in a build
configuration. As a shortcut, it can also create a new build configuration in
the process, which is just what we need here. Let's start with GCC (remember
we are in the project's root directory):
\
$ bdep init -C ../hello-gcc @gcc cc config.cxx=g++
initializing in project /tmp/hello/
created configuration @gcc /tmp/hello-gcc/ default,auto-synchronized
synchronizing:
new hello/0.1.0-a.0.19700101000000
\
The \c{--config-create|-C} option instructs \c{init} to create a new configuration
in the specified directory (\c{../hello-gcc} in our case). To make referring
to configurations easier, we can give it a name, which is what we do with
\c{@gcc}. The next argument (\c{cc}, which stands for \i{C-common}) is the build
system module we would like to configure. It implements compilation and
linking rules for the C and C++ languages. Finally, \c{config.cxx=g++} is (one
of) this module's configuration variables that specifies the C++ compiler we
would like to use (the corresponding C compiler will be determined
automatically). Let's for now also ignore that \c{synchronizing:...} bit along
with the strange-looking \c{19700101000000} in the version \- it will become clear
what's going on here in a moment.
\N|Note to Windows users: a command line argument with leading \c{@} has
a special meaning in PowerShell. To work around this, you can use the
alternative \c{-@gcc} syntax or the \c{-n\ gcc} option.|
Now the same for Clang:
\
$ bdep init -C ../hello-clang @clang cc config.cxx=clang++
initializing in project /tmp/hello/
created configuration @clang /tmp/hello-clang/ auto-synchronized
synchronizing:
new hello/0.1.0-a.0.19700101000000
\
If we check the parent directory, we should now see two build configurations
next to our project:
\
$ ls ..
hello/
hello-gcc/
hello-clang/
\
\N|If, as in the above examples, our configuration directories are next to the
project and their names are in the \c{\i{prj-name}\b{-}\i{cfg-name}} form,
then we can use the shortcut version of the \c{init} command:
\
$ bdep init -C @clang cc config.cxx=clang++
\
|
Things will also look pretty similar if you are on Windows instead of a
UNIX-like operating system. For example, to initialize our project on Windows
with Visual Studio, start the Visual Studio development command prompt and
then run:
\N|Currently we have to run \c{build2} tools from a suitable Visual Studio
development command prompt. This requirement will likely be removed in the
future.|
\
> bdep init -C ..\hello-debug @debug cc ^
  config.cxx=cl ^
  \"config.cc.coptions=/MDd /Od /Zi\" ^
  config.cc.loptions=/DEBUG

> bdep init -C ..\hello-release @release cc ^
  config.cxx=cl ^
  config.cc.coptions=/O2
\
\N|Besides the \c{coptions} (compile options) and \c{loptions} (link options),
other commonly used \c{cc} module configuration variables are \c{poptions}
(preprocess options) and \c{libs} (extra libraries to link). We can also use
their \c{config.c.*} (C compilation) and \c{config.cxx.*} (C++ compilation)
variants if we only want them applied during the respective language
compilation. For example:
\
$ bdep init ... cc \
  config.cxx=clang++ \
  config.cc.coptions=-g \
  config.cxx.coptions=-stdlib=libc++
\
|
One difference you might have noticed when creating the \c{gcc} and \c{clang}
configurations above is that the first one was designated as the default. The
default configuration is used by \c{bdep} commands if no configuration is
specified explicitly (see \l{bdep-projects-configs(1)} for details). It is
also the configuration that is used if we run the build system in the
project's source directory. So, normally, you would make your everyday
development configuration the default. Let's try that:
\
$ bdep status
hello configured 0.1.0-a.0.19700101000000
$ b
c++ hello/cxx{hello}@../hello-gcc/hello/hello/
ld ../hello-gcc/hello/hello/exe{hello}
ln ../hello-gcc/hello/hello/exe{hello} -> hello/
$ b test
test hello/testscript{testscript} ../hello-gcc/hello/hello/exe{hello}
$ hello/hello World
Hello, World!
\
\N|To see the actual compilation command lines, run \c{b\ -v} and for even
more details, run \c{b\ -V}. See \l{b(1)} for more information on these
and other build system options.|
In contrast, the Clang configuration has to be requested explicitly:
\
$ bdep status @clang
hello configured 0.1.0-a.0.19700101000000
$ b ../hello-clang/hello/
c++ hello/cxx{hello}@../hello-clang/hello/hello/
ld ../hello-clang/hello/hello/exe{hello}
$ b test: ../hello-clang/hello/
test hello/testscript{testscript} ../hello-clang/hello/hello/exe{hello}
$ ../hello-clang/hello/hello/hello World
Hello, World!
\
As you can see, using the build system directly on configurations other than
the default requires explicitly specifying their paths. It would have been
more convenient if we could refer to them by names. The \l{bdep-update(1)} and
\l{bdep-test(1)} commands allow us to do exactly that:
\
$ bdep test @clang
c++ hello/cxx{hello}@../hello-clang/hello/hello/
ld ../hello-clang/hello/hello/exe{hello}
test hello/testscript{testscript} ../hello-clang/hello/hello/exe{hello}
\
And we can also perform the desired build system operation on several (or
\c{--all|-a}) configurations at once:
\
$ bdep test @gcc @clang
in configuration @gcc:
test hello/testscript{testscript} ../hello-gcc/hello/hello/exe{hello}
in configuration @clang:
test hello/testscript{testscript} ../hello-clang/hello/hello/exe{hello}
\
\N|As we will see later, the \l{bdep-test(1)} command also allows us to test
immediate (\c{--immediate|-i}) or all (\c{--recursive|-r}) dependencies of our
project. We call it \i{deep testing}.|
While we are here, let's also check how hard it would be to cross-compile:
\
$ bdep init -C ../hello-mingw @mingw cc config.cxx=x86_64-w64-mingw32-g++
initializing in project /tmp/hello/
created configuration @mingw /tmp/hello-mingw/ auto-synchronized
synchronizing:
new hello/0.1.0-a.0.19700101000000
$ bdep update @mingw
c++ hello/cxx{hello}@../hello-mingw/hello/hello/
ld ../hello-mingw/hello/hello/exe{hello}
\
As you can see, cross-compiling in \c{build2} is nothing special. In our case,
on a properly set up GNU/Linux machine (that automatically uses \c{wine} as an
\c{.exe} interpreter) we can even run tests (in \c{build2} this is called
\i{cross-testing}):
\
$ bdep test @mingw
test hello/testscript{testscript} ../hello-mingw/hello/hello/exe{hello}
$ ../hello-mingw/hello/hello/hello.exe Windows
Hello, Windows!
\
Let's review what it takes to initialize a project's infrastructure and
perform the first build. For an existing project:
\
$ git clone .../hello.git
$ cd hello
$ bdep init -C ../hello-gcc @gcc cc config.cxx=g++
$ b
\
For a new project:
\
$ bdep new -t exe -l c++ hello
$ cd hello
$ bdep init -C ../hello-gcc @gcc cc config.cxx=g++
$ b
\
If you prefer, the \c{new} and \c{init} steps can be combined into a single
command:
\
$ bdep new -t exe -l c++ hello -C hello-gcc @gcc cc config.cxx=g++
\
And if you need to deinitialize a project in one or more build configurations,
there is the \l{bdep-deinit(1)} command for that:
\
$ bdep deinit @gcc @clang
deinitializing in project /tmp/hello/
in configuration @gcc:
synchronizing:
drop hello
in configuration @clang:
synchronizing:
drop hello
\
As mentioned earlier, by default \l{bdep-new(1)} initializes a \c{git}
repository for us. Now that we have successfully built and tested our project,
it might be a good idea to make a first commit and publish it to a remote
repository where others can find it. Using GitHub as an example:
\
$ git add .
$ git commit -m \"Initial implementation\"
$ git remote add origin git@github.com:john-doe/hello.git
$ git push -u
\
While we have managed to test a couple of platforms (Linux and Windows) and
compiler versions (Clang and GCC) locally, there are quite a few combinations
that we haven't tried (Mac OS with Apple Clang and Windows with MSVC, to name
the major ones). We could test them manually, some with the help of
virtualization while for others (such as Mac OS) we may need physical
hardware. Add a few versions for each compiler and we are looking at a dozen
build configurations. Needless to say, testing on all of them manually is a
lot of work. Now that we have our project available from a public remote
repository, we can instead use the remote testing functionality offered by the
\l{bdep-ci(1)} command. For example:
\
$ bdep ci
submitting:
to: https://ci.cppget.org
in: https://github.com/john-doe/hello.git#master@93e1dbc94baa
package: hello
version: 0.1.0-a.0.20180907091517.93e1dbc94baa
continue? [y/n] y
############################################################# 100.0%
CI request is queued:
https://ci.cppget.org/@d6ee90f4-21a9-47a0-ab5a-7cd2f521d3d8
\
Let's see what's going on here. By default \c{ci} submits a test request to
\l{https://ci.cppget.org ci.cppget.org}, a public CI service run by the
\c{build2} project (see available \l{https://ci.cppget.org?build-configs Build
Configurations} and \l{https://ci.cppget.org?ci Use Policies}). It is testing
the current working tree state (branch and commit) of our package which should
be available from our remote repository (on GitHub in this example) since
that's where the CI service expects to find it. In response we get a URL where
we can see the build and test results, logs, etc.
\N|This \i{push} CI model works particularly well with the \"feature branch\"
development workflow. Specifically, you would develop a new feature in a
separate branch, publishing and remote-testing it as necessary. When the
feature is ready, you would merge any changes from \c{master}, test the result
one more time, and then merge the feature into \c{master}.|
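A minimal sketch of this workflow might look like this (the branch name is, of
course, arbitrary):
\
$ git checkout -b feature
...                            # develop and commit as usual
$ git push -u origin feature
$ bdep ci                      # remote-test the feature branch
\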
Now is a good time to get an overview of the \c{build2} toolchain. After all,
we have already used two of its tools (\c{bdep} and \c{b}) without a clear
understanding of what they actually are.
Unlike most other programming languages that encapsulate the build system,
package dependency manager, and project dependency manager into a single tool
(such as Rust's \c{cargo} or Go's \c{go}), \c{build2} is a hierarchy of
several tools that you will be using directly and which together with your
version control system (VCS) will constitute the core of your project
management toolset.
\N|While \c{build2} can work without a VCS, this will result in reduced
functionality.|
At the bottom of the hierarchy is the build system, \l{b(1)}. Next comes the
package dependency manager, \l{bpkg(1)}. It is primarily used for \i{package
consumption} and depends on the build system. The top of the hierarchy is the
project dependency manager, \l{bdep(1)}. It is used for \i{project
development} and relies on \c{bpkg} for building project packages and their
dependencies.
\N|The main reason for this separation is modularity and the resulting
flexibility: there are situations where we only need the build system (for
example, when building a package for a system package manager where all the
dependencies should be satisfied from the system repository), or only the
build system and package manager (for example, when a build bot is building a
package for testing).
Note also that strictly speaking \c{build2} is not C/C++-specific; its build
model is general enough to handle any DAG-based operations and its
package/project dependency management can be used for any compiled language.|
\N|As we will see in a moment, \c{build2} also integrates with your VCS in
order to automate project versioning. Note that currently only \c{git(1)} is
supported.|
Now that we understand the tooling, let's also revisit the notion of \i{build
configuration} (those \c{hello-gcc} and \c{hello-clang} directories). A
\c{bdep} build configuration is actually a \c{bpkg} build configuration which,
in the build system terms, is an \i{amalgamation} \- a project that contains
\i{subprojects}. In our case, the subprojects in these amalgamations will be
the projects we have initialized with \c{init} and, as we will see later,
packages that they depend on. For example, here is what our \c{hello-gcc}
contains:
\
$ tree hello-gcc
hello-gcc/
├── .bpkg/
├── build/
│   └── config.build
└── hello/
    ├── build/
    │   └── config.build
    └── hello/
        ├── hello
        └── hello.o
\
\N|Underneath \l{bdep-init(1)} with the \c{--config-create|-C} option calls
\l{bpkg-cfg-create(1)} which, in turn, performs the build system \c{create}
meta-operation (see \l{b(1)} for details).
The important point here is that the \c{bdep} build configuration is not a
black box that you should never look inside of. On the contrary, it is a
normal and predictable concept of the package manager and the build system and
as long as you understand what you are doing, you should feel free to interact
with it directly.|
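For example, we could query the package manager directly about the state of
our package in this configuration or, as we did earlier, run the build system
in it:
\
$ bpkg status -d ../hello-gcc hello
$ b ../hello-gcc/hello/
\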
Let's now move on to the reason why there is \i{dep} in the \c{bdep} name:
dependency management.
\h#guide-repositories|Package Repositories|
Say we have realized that writing \i{\"Hello, World!\"} programs is a fairly
common task and that someone must have written a library to help with that. So
let's see if we can find something suitable to use in our project.
Where should we look? That's a good question. But before we can try to answer
it, we need to understand where \c{build2} can source dependencies. In
\c{build2} packages come from \i{package repositories}. Two commonly used
repository types are \i{version control} and \i{archive}-based (see
\l{bpkg-repository-types(1)} for details).
As the name suggests, a version control-based repository uses a VCS as its
distribution mechanism. \N{Currently, only \c{git} is supported.} Such a
repository normally contains multiple versions of a single package or,
perhaps, of a few related packages.
An archive-based repository contains multiple, potentially unrelated
packages/versions as archives along with some meta information (package list,
prerequisite/complement repositories, signatures, etc) that are all accessible
via HTTP(S).
Version control and archive-based repositories have different
trade-offs. Version control-based repositories are great for package
developers: With services like GitHub they are trivial to set up. In fact, your
project's (already existing) VCS repository will normally be the \c{build2}
package repository \- you might need to add a few files, but that's about it.
However, version control-based repositories are not without drawbacks: It will
be hard for your users to discover your packages (try searching for \"hello
library\" on GitHub \- most of the results are not even in C++ let alone
packaged for \c{build2}). There is also the issue of continuous availability:
users can delete their repositories, services may change their policies or go
out of business, and so on. Version control-based repositories also lack
repository authentication and package signing. Finally, obtaining the
available package list for such repositories can be slow.
A central, archive-based repository would address all these drawbacks: It
would be a single place to search for packages. Published packages will never
disappear and can be easily mirrored. Packages are signed and the repository
is authenticated (see \l{bpkg-repository-signing(1)} for details). And, last,
but not least, archive-based repositories are fast.
\l{https://cppget.org cppget.org} is the \c{build2} community's central
package repository. While centralized, it is also easy to mirror since its
contents are accessible via plain HTTPS (you can browse
\l{https://pkg.cppget.org pkg.cppget.org} to get an idea). As an added
benefit, packages on \l{https://cppget.org cppget.org} are continuously
\l{https://cppget.org/?builds built and tested} on all the major
platform/compiler combinations with the results available as part of the
package description.
\N|The main drawback of archive-based repositories is the setup cost. Getting
a basic repository going is relatively easy \- all you need is an HTTP(S)
server. Adding a repository web interface like that on \l{https://cppget.org
cppget.org} will require running \l{https://cppget.org/brep \c{brep}}. And
adding CI will require running a bunch of build bots
(\l{https://cppget.org/bbot \c{bbot}}). Note also that in \c{build2}
archive-based repositories can be federated with different sections of
the repository being hosted/managed potentially independently.|
To summarize, version control-based repositories are great for package
developers while a central, archive-based repository is convenient for package
consumers. A reasonable strategy is then for package developers to publish
their releases to a central repository. Package consumers can then decide
which repository to use based on their needs. For example, one could use
\l{https://cppget.org cppget.org} as a (fast, reliable, and secure) source of
stable versions but also add, say, \c{git} repositories for select packages
(perhaps with the \c{#HEAD} fragment filter to improve download speed) for
testing development snapshots. In this model the two repository types
complement each other.
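For example, a \c{repositories.manifest} entry for such a development snapshot
source could look like this (using a \c{git} repository location that we will
encounter again shortly):
\
role: prerequisite
location: https://git.build2.org/hello/libhello.git#HEAD
\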
\N|Publishing of packages to archive-based repositories is discussed in
\l{#guide-versioning-releasing Versioning and Release Management}.|
Let's see how all this works in practice. Go over to \l{https://cppget.org
cppget.org} and type \"hello library\" in the search box. At the top of the
search result you should see the \l{https://cppget.org/libhello \c{libhello}}
package and if you follow the link you will see the package description page
along with a list of available versions. Pick a version that you like and you
will see the package version description page with quite a bit of information,
including the list of platform/compiler combinations that this version has
been successfully (or unsuccessfully) tested with. If you like what you see,
copy the \c{location} value \- this is the repository location where this
package version can be sourced from.
\N|The \l{https://cppget.org cppget.org} repository is split into several
sections: \c{stable}, \c{testing}, \c{beta}, \c{alpha}, and \c{legacy}, with
each section having its own repository location (see the repository's
\l{https://cppget.org/?about about} page for details on each section's
policies). Note also that \c{testing} is complemented by \c{stable}, \c{beta}
by \c{testing}, and so on, so you only need to choose the lowest stability
level and you will automatically \"see\" packages from the more stable
sections.|
\N|The \l{https://cppget.org cppget.org} \c{stable} sections will always
contain the \c{libhello} library version \c{1.0.X} that was generated using
the following \l{bdep-new(1)} command line:
\
$ bdep new -t lib -l c++ libhello
\
It can be used as a predictable test dependency when setting up new projects.|
Let's say we've visited the \c{libhello} project's
\l{https://git.build2.org/cgit/hello/libhello/ home page} (for example by
following a link from the package details page) and noticed that it is being
developed in a \c{git} repository. How can we see what's available there? If
the releases are tagged, then we can infer the available released versions
from the tags. But that doesn't tell us anything about what's happening on the
\c{HEAD} or in the branches. For that we can use the package manager's
\l{bpkg-rep-info(1)} command:
\
$ bpkg rep-info https://git.build2.org/hello/libhello.git
libhello/1.0.0
libhello/1.1.0
\
As you can see, besides \c{1.0.0} that we have seen in \c{cppget.org/stable},
there is also \c{1.1.0} (which is perhaps being tested in
\c{cppget.org/testing}). We can also check what might be available on the
\c{HEAD} (see \l{bpkg-repository-types(1)} for details on the \c{git}
repository URL format):
\
$ bpkg rep-info https://git.build2.org/hello/libhello.git#HEAD
libhello/1.1.1-a.0.20180504111511.2e82f7378519
\
\N|We can also use the \c{rep-info} command on archive-based repositories,
however, if available, the web interface is usually more convenient and
provides more information.|
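For example, to query the \c{stable} section of \l{https://cppget.org
cppget.org} we could run:
\
$ bpkg rep-info https://pkg.cppget.org/1/stable
\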
To summarize, we found two repositories for the \c{libhello} package: the
archive-based \l{https://cppget.org cppget.org} that contains the released
versions as well as its development \c{git} repository where we can get the
bleeding edge stuff. Let's now see how we can add \c{libhello} to our
project.
\h#guide-add-remove-deps|Adding and Removing Dependencies|
So we found \c{libhello} that we would like to use in our \c{hello}
project. First, we edit the \c{repositories.manifest} file found in the root
directory of our project and add one of the \c{libhello} repositories as a
prerequisite. Let's start with \l{https://cppget.org cppget.org}:
\
role: prerequisite
location: https://pkg.cppget.org/1/stable
\
\N|Refer to \l{bpkg#manifest-repository Repository Manifest} for details on
the repository manifest values.|
Next, we edit the \c{manifest} file (again, found in the root of our project)
and specify the dependency on \c{libhello} with an optional version constraint.
For example:
\
depends: libhello ^1.0.0
\
Let's briefly discuss version constraints (for details see the
\l{bpkg#manifest-package-depends \c{depends}} value documentation). A version
constraint can be expressed with a comparison operator (\c{==}, \c{>},
\c{<}, \c{>=}, \c{<=}), a range shortcut operator (\c{~} and \c{^}), or a
range. Here are a few examples:
\
depends: libhello == 1.2.3
depends: libhello >= 1.2.3
depends: libhello ~1.2.3
depends: libhello ^1.2.3
depends: libhello [1.2.3 1.2.9)
\
You may already be familiar with the tilde (\c{~}) and caret (\c{^})
constraints from dependency managers for other languages. To recap, tilde
allows upgrades to any further patch versions while caret also allows upgrades
to further minor versions. They are equivalent to the following ranges:
\
~X.Y.Z [X.Y.Z X.Y+1.0)
^X.Y.Z [X.Y.Z X+1.0.0) if X > 0
^0.Y.Z [0.Y.Z 0.Y+1.0) if X == 0
\
\N|A zero major version component is customarily used during early development
where the minor version effectively becomes major. As a result, the caret
constraint has a special treatment of this case.|
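To make this concrete, here are the shortcut constraints spelled out for a
specific version (a worked illustration of the table above):
\
depends: libhello ~1.2.3  # [1.2.3 1.3.0)
depends: libhello ^1.2.3  # [1.2.3 2.0.0)
depends: libhello ^0.2.3  # [0.2.3 0.3.0)
\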
Unless you have good reasons not to (for example, a dependency does not use
semantic versioning), we suggest that you use the \c{^} constraint which
provides a good balance between compatibility and upgradability with \c{~}
being a more conservative option.
Ok, we've specified where our package comes from (\c{repositories.manifest})
and which versions we find acceptable (\c{manifest}). The next step is to edit
\c{hello/buildfile} and import the \c{libhello} library into our build:
\
import libs += libhello%lib{hello}
\
Finally, we modify our source code to use the library:
\
#include <libhello/hello.hxx>
...
int main (int argc, char* argv[])
{
...
hello::say_hello (cout, argv[1]);
}
\
\N|You are probably wondering why we have to specify this repeating
information in so many places. Let's start with the source code: we can't
specify the version constraint or location there because it will have to be
repeated in every source file that uses the dependency.
Moving up, \c{buildfile} is also not a good place to specify this information
for the same reason (a library can be imported in multiple buildfiles) plus
the build system doesn't really know anything about version constraints or
repositories which is the purview of the dependency management tools.
Finally, we have to separate the version constraint and the location because
the same package can be present in multiple repositories with different
policies. For example, when a package from a version control-based repository
is published in an archive-based repository, its \c{repositories.manifest}
file is ignored and all its dependencies should be available from the
archive-based repository itself (or its fixed set of prerequisite
repositories). In other words, \c{manifest} belongs to a package while
\c{repositories.manifest} \- to a repository.
Also note that this is unlikely to become burdensome since adding new
dependencies is not something that happens often. There are also ideas to
automate this with a \c{bdep-add(1)} command in the future.|
To summarize, these are the files we had to modify to add a dependency
to our project:
\
repositories.manifest # add https://pkg.cppget.org/1/stable
manifest # add 'depends: libhello ^1.0.0'
buildfile # import libhello
hello.cxx # use libhello
\
With a new dependency added, let's check the status of our project:
\
$ bdep status
fetching pkg:cppget.org/stable (prerequisite of dir:/tmp/hello)
warning: authenticity of the certificate for pkg:cppget.org/stable
cannot be established
certificate is for cppget.org, \"Code Synthesis\" <admin@cppget.org>
certificate SHA256 fingerprint:
86:BA:D4:DE:2C:87:1A:EE:38:<...>:5A:EA:F4:F7:8C:1D:63:30:C6
trust this certificate? [y/n] y
hello configured 0.1.0-a.0.19700101000000
available 0.1.0-a.0.19700101000000#1
\
The \l{bdep-status(1)} command has detected that the dependency information
has changed and tells us that a new \i{iteration} of our project (that \c{#1})
is now available for \i{synchronization} with the build configuration.
We've also been prompted to authenticate the prerequisite repository. This
will have to happen once for every build configuration we initialize our
project in and can quickly become tedious. To overcome this, we can mention
the certificate fingerprint that we wish to automatically trust in the
\c{repositories.manifest} file (replace it with the actual fingerprint from
the repository's about page):
\
role: prerequisite
location: https://pkg.cppget.org/1/stable
trust: 86:BA:D4:DE:2C:87:1A:EE:38:<...>:5A:EA:F4:F7:8C:1D:63:30:C6
\
To synchronize a project with one or more build configurations we use the
\l{bdep-sync(1)} command:
\
$ bdep sync
synchronizing:
new libhello/1.0.0 (required by hello)
upgrade hello/0.1.0-a.0.19700101000000#1
\
Or we could just build the project without an explicit \c{sync} \- if
necessary, it will be automatically synchronized:
\
$ b
synchronizing:
new libhello/1.0.0 (required by hello)
upgrade hello/0.1.0-a.0.19700101000000#1
c++ ../hello-gcc/libhello-1.0.0/libhello/cxx{hello}
ld ../hello-gcc/libhello-1.0.0/libhello/libs{hello}
c++ hello/cxx{hello}@../hello-gcc/hello/hello/
ld ../hello-gcc/hello/hello/exe{hello}
ln ../hello-gcc/hello/hello/exe{hello} -> hello/
\
The synchronization as performed by the \c{sync} command is two-way:
dependency packages are first added, removed, upgraded, or downgraded in build
configurations according to the project's version constraints and user
input. Then the actual versions of the dependencies present in the build
configurations are recorded in the project's \c{lockfile} so that if desired,
the build can be reproduced exactly. \N{The \c{lockfile} functionality is not
yet implemented.} For a new dependency the latest available version that
satisfies the version constraint is used.
\N|Synchronization is also the last step in the \l{bdep-init(1)} command's
logic.|
Let's now examine the status in all (\c{--all|-a}) the build configurations
and include the immediate dependencies (\c{--immediate|-i}):
\
$ bdep status -ai
in configuration @gcc:
hello configured 0.1.0-a.0.19700101000000#1
libhello ^1.0.0 configured 1.0.0
in configuration @clang:
hello configured 0.1.0-a.0.19700101000000
available 0.1.0-a.0.19700101000000#1
\
Since we didn't specify a configuration explicitly, only the default (\c{gcc})
was synchronized. Normally, you would try a new dependency in one
configuration, make sure everything looks good, then synchronize the rest with
\c{--all|-a} (or, again, just build what you need directly). Here are a few
examples (see \l{bdep-projects-configs(1)} for details):
\
$ bdep sync -a
$ bdep sync @gcc @clang
$ bdep sync -c ../hello-mingw
\
After adding a new (or upgrading/downgrading existing) dependency, it's a good
idea to \i{deep-test} our project: run not only our own tests but also those of
its immediate (\c{--immediate|-i}) or even all (\c{--recursive|-r}) dependencies.
For example:
\
$ bdep test -ai
in configuration @gcc:
test hello/testscript{testscript} ../hello-gcc/hello/hello/exe{hello}
test ../hello-gcc/libhello-1.0.0/tests/basics/exe{driver}
in configuration @clang:
test hello/testscript{testscript} ../hello-clang/hello/hello/exe{hello}
test ../hello-clang/libhello-1.0.0/tests/basics/exe{driver}
\
To get rid of a dependency, we simply remove it from the \c{manifest} file
and synchronize the project. For example, assuming \c{libhello} is no longer
mentioned as a dependency in our \c{manifest}:
\
$ bdep status
hello configured 0.1.0-a.0.19700101000000#1
available 0.1.0-a.0.19700101000000#2
$ bdep sync
synchronizing:
drop libhello/1.0.0 (unused)
upgrade hello/0.1.0-a.0.19700101000000#2
\
\N|If instead of building a dependency from source you would prefer to use a
version that is installed by your system package manager, see
\l{#guide-system-deps Using System-Installed Dependencies}. And for
information on using dependencies that are not \c{build2} packages refer to
\l{#guide-unpackaged-deps Using Unpackaged Dependencies}.|
\h#guide-upgrade-downgrade-deps|Upgrading and Downgrading Dependencies|
Let's say we would like to try that \c{1.1.0} version we have seen in
the \c{libhello} \c{git} repository. First, we need to add the
repository to the \c{repositories.manifest} file:
\
role: prerequisite
location: https://git.build2.org/hello/libhello.git
\
\N|Note that we don't need the \c{trust} value since \c{git} repositories
are not authenticated.|
To refresh the list of available dependency versions we use the
\l{bdep-fetch(1)} command (or the \c{--fetch|-f} option to \c{status}):
\
$ bdep fetch
$ bdep status libhello
libhello configured 1.0.0 available [1.1.0]
\
To upgrade (or downgrade) dependencies we again use the \l{bdep-sync(1)}
command. We can upgrade one or more specific dependencies by listing them
as arguments to \c{sync}:
\
$ bdep sync libhello
synchronizing:
new libformat/1.0.0 (required by libhello)
new libprint/1.0.0 (required by libhello)
upgrade libhello/1.1.0
upgrade hello/0.1.0-a.0.19700101000000#3
\
Without an explicit version or the \c{--patch|-p} option, \c{sync} will
upgrade the specified dependencies to the latest available versions. For
example, if we don't like version \c{1.1.0}, we can downgrade it back to
\c{1.0.0} by specifying the version explicitly (we pass \c{--old-available|-o}
to \c{status} to see the old versions):
\
$ bdep status -o libhello
libhello configured 1.1.0 available (1.1.0) [1.0.0]
$ bdep sync libhello/1.0.0
synchronizing:
drop libprint/1.0.0 (unused)
drop libformat/1.0.0 (unused)
downgrade libhello/1.0.0
reconfigure hello/0.1.0-a.0.19700101000000#3
\
\N|The available versions are listed in descending order with \c{[]}
indicating that the version is only available as a dependency and \c{()}
marking the current version.|
Instead of specific dependencies we can also upgrade (\c{--upgrade|-u}) or
patch (\c{--patch|-p}) immediate (\c{--immediate|-i}) or all
(\c{--recursive|-r}) dependencies of our project.
As a more realistic example, version \c{1.1.0} of \c{libhello} depends on two
other libraries: \c{libformat} and \c{libprint}. Here is our project's
dependency tree while we were still using that version:
\
$ bdep status -r
hello configured 0.1.0-a.0.19700101000000#3
  libhello ^1.0.0 configured 1.1.0
    libformat ^1.0.0 configured 1.0.0
    libprint ^1.0.0 configured 1.0.0
\
A typical conservative dependency management workflow would look like this:
\
$ bdep status -fi # refresh and examine immediate dependencies
hello configured 0.1.0-a.0.19700101000000#3
libhello configured 1.1.0 available [2.0.0] [1.2.0] [1.1.2] [1.1.1]
$ bdep sync -pi # upgrade immediate to latest patch version
synchronizing:
upgrade libhello/1.1.2
reconfigure hello/0.1.0-a.0.19700101000000#3
continue? [Y/n] y
\
Notice that in case of such mass upgrades you are prompted for confirmation
before anything is actually changed (unless you pass \c{--yes|-y}).
In contrast, the following would be a fairly aggressive workflow where we
upgrade everything to the latest available version (version constraints
permitting; here we assume \c{^1.0.0} was used for all the dependencies):
\
$ bdep status -fr # refresh and examine all dependencies
hello configured 0.1.0-a.0.19700101000000#3
libhello configured 1.1.0 available [2.0.0] [1.2.0] [1.1.1]
libprint configured 1.0.0 available [2.0.0] [1.1.0] [1.0.1]
libformat configured 1.0.0 available [2.0.0] [1.1.0] [1.0.1]
$ bdep sync -ur # upgrade all to latest available version
synchronizing:
upgrade libprint/1.1.0
upgrade libformat/1.1.0
upgrade libhello/1.2.0
reconfigure hello/0.1.0-a.0.19700101000000#3
continue? [Y/n] y
\
We can also have something in between: patch all (\c{sync\ -pr}), upgrade
immediate (\c{sync\ -ui}), or even upgrade immediate and patch the rest
(\c{sync\ -ui} followed by \c{sync\ -pr}).
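In command form, that last combination would be:
\
$ bdep sync -ui   # upgrade immediate dependencies
$ bdep sync -pr   # then patch the rest
\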
\h#guide-versioning-releasing|Versioning and Release Management|
Let's now discuss versioning and release management and, yes, that
strange-looking \c{0.1.0-a.0.19700101000000} we keep seeing. While a build
system project doesn't need a version and a \c{bpkg} package can use custom
versioning schemes (see \l{bpkg#package-version Package Version}), a project
managed by \c{bdep} must use \i{standard versioning}. \N{A dependency, which
is a \c{bpkg} package, need not use standard versioning.}
Standard versioning (\i{stdver}) is a \l{https://semver.org semantic
versioning} (\i{semver}) scheme with a more precisely defined pre-release
component and without any build metadata.
\N|If you believe that \i{semver} is just \c{\i{major}.\i{minor}.\i{patch}},
then in your worldview \i{stdver} would be the same as \i{semver}. In reality,
\i{semver} also allows loosely defined pre-release and build metadata
components. For example, \c{1.2.3-beta.1+build.23456} is a valid \i{semver}.|
A standard version has the following form:
\c{\i{major}\b{.}\i{minor}\b{.}\i{patch}[\b{-}\i{prerel}]}
The \ci{major}, \ci{minor}, and \ci{patch} components have the same meaning as
in \i{semver}. The \ci{prerel} component is used to provide \i{continuous
versioning} of our project between releases. Specifically, during development
of a new version we may want to publish several pre-releases, for example,
alpha or beta. In between those we may also want to publish a number of
snapshots, for example, for CI. With continuous versioning all these releases,
pre-releases, and snapshots are assigned unique, properly ordered versions.
\N|Continuous versioning is a cornerstone of the \c{build2} project dependency
management. In case of snapshots, an appropriate version is assigned
automatically in cooperation with your VCS.|
The \ci{prerel} component for a pre-release has the following form:
\c{(\b{a}|\b{b})\b{.}\i{num}}
Here \cb{a} stands for alpha, \cb{b} stands for beta, and \ci{num} is the
alpha/beta number. For example:
\
1.1.0 # final release for 1.1.0
1.2.0-a.1 # first alpha pre-release for 1.2.0
1.2.0-a.2 # second alpha pre-release for 1.2.0
1.2.0-b.1 # first beta pre-release for 1.2.0
1.2.0 # final release for 1.2.0
\
The \ci{prerel} component for a snapshot has the following form:
\c{(\b{a}|\b{b})\b{.}\i{num}\b{.}\i{snapsn}[\b{.}\i{snapid}]}
Where \ci{snapsn} is the snapshot sequence number and \ci{snapid} is
the snapshot id. In case of \c{git}, \ci{snapsn} is the commit timestamp
in the \c{YYYYMMDDhhmmss} form and UTC timezone while \ci{snapid} is
a 12-character abbreviated commit id. For example:
\
1.2.3-a.1.20180319215815.26efe301f4a7
\
Notice also that a snapshot version is ordered \i{after} the corresponding
pre-release version. That is, \c{1.2.3-a.1\ <\ 1.2.3-a.1.1}. As a result, it
is customary to start the development of a new version with \c{X.Y.Z-a.0.z},
that is, a snapshot after the (non-existent) zero'th alpha release. \N{We will
explain the meaning of \cb{z} in this version momentarily.} The following
chronologically-ordered versions illustrate a typical release flow of a
project that uses \c{git} as its VCS:
\
0.1.0-a.0.19700101000000 # snapshot (no commits yet)
0.1.0-a.0.20180319215815.26efe301f4a7 # snapshot (first commit)
... # more commits/snapshots
0.1.0-a.1 # pre-release (first alpha)
0.1.0-a.1.20180319221826.a6f0f41205b8 # snapshot
... # more commits/snapshots
0.1.0-a.2 # pre-release (second alpha)
0.1.0-a.2.20180319231937.b701052316c9 # snapshot
... # more commits/snapshots
0.1.0-b.1 # pre-release (first beta)
0.1.0-b.1.20180319242038.c812163417da # snapshot
... # more commits/snapshots
0.1.0 # release
0.2.0-a.0.20180319252139.d923274528eb # snapshot (first in 0.2.0)
...
\
For a more detailed discussion of standard versioning and its support in
\c{build2} refer to \l{b#module-version \c{version} Module}.
Let's now see how this works in practice by publishing a couple of versions
for our \c{hello} project. By now it should be clear what that
\c{0.1.0-a.0.19700101000000} means \- it is the first snapshot version of our
project. Since there are no commits yet, it has the UNIX epoch as its commit
timestamp. Let's see what changes after we've made our first commit:
\
$ git add .
$ git commit -m \"Initial implementation\"
$ bdep status
hello configured 0.1.0-a.0.19700101000000
      available 0.1.0-a.0.20180507062614.ee006880fc7e
\
Just like with changes to dependency information, \c{status} has detected that
a new (snapshot) version of our project is available for synchronization.
\N|Another way to view the project's version (which works even if we are
not using \c{bdep}) is with the build system's \c{info} meta-operation:
\
$ b info
project: hello
version: 0.1.0-a.0.20180507062614.ee006880fc7e
summary: hello executable project
...
\
|
Let's synchronize with the default build configuration:
\
$ bdep sync
synchronizing:
upgrade hello/0.1.0-a.0.20180507062614.ee006880fc7e
$ bdep status
hello configured 0.1.0-a.0.20180507062614.ee006880fc7e
\
\N|Notice that we didn't have to manually change the version anywhere. All we
had to do was commit our changes and a new snapshot version was automatically
derived by \c{build2} from the new \c{git} commit. Without this automation
continuous versioning would hardly be practical.|
If we now make another commit, we will see a similar picture:
\
$ bdep status
hello configured 0.1.0-a.0.20180507062614.ee006880fc7e
      available 0.1.0-a.0.20180507062615.8fb9de05b38f
\
\N|Note that you don't need to manually run \c{sync} after every commit. As
discussed earlier, you can simply run the build system to update your project
and things will get automatically synchronized if necessary.|
Ok, time for our first release. Let's start with \c{0.1.0-a.1}. Unlike
snapshots, for pre-releases as well as final releases we have to change the
version in the \c{manifest} file:
\
version: 0.1.0-a.1
\
\N|The \c{manifest} file is the singular place where we specify the package
version. The build system's \l{b#module-version \c{version} module} makes it
available in various forms in buildfiles and even source code.|
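\N|To illustrate, here is a minimal sketch of accessing this version from a
\c{buildfile}, assuming the \c{version} module has been loaded (as it is in
projects generated by \c{bdep-new}):
\
# Print the version specified in manifest (sketch).
#
info \"configuring $project $version\"
\
|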
To ensure continuous versioning, this change to the version must be the last
commit for this (pre-)release, which itself must be immediately followed by a
second change to the version that starts the development of the next
(pre-)release. We
also recommend that you tag the release commit with a tag name in the
\c{\b{v}\i{X}.\i{Y}.\i{Z}} form.
\N|Having regular release tag names with the \cb{v} prefix allows one to
distinguish them from other tags, for example, with wildcard patterns.|
Here is the release workflow for our example:
\
$ git commit -a -m \"Release version 0.1.0-a.1\"
$ git tag -a v0.1.0-a.1 -m \"Tag version 0.1.0-a.1\"
$ git push --follow-tags
# Version 0.1.0-a.1 is now public.
$ edit manifest # change 'version: 0.1.0-a.1.z'
$ git commit -a -m \"Change version to 0.1.0-a.1.z\"
$ git push
# Master is now open for business.
\
Notice also that when specifying a snapshot version in \c{manifest} we use the
special \cb{z} snapshot value (for example, \c{0.1.0-a.1.z}) which is
recognized and automatically replaced by \c{build2} with, in case of \c{git},
the current commit timestamp and id (refer to \l{b#module-version \c{version}
Module} for details).
While not particularly complicated, performing the release steps manually is
both tedious and error-prone. Instead, this process can be automated with the
\l{bdep-release(1)} command. Specifically, in its default mode, this command
will update the version in the \c{manifest} file, commit and tag this change,
open the next development cycle (again, by changing \c{manifest} and
committing), and, finally, if \c{--push} is specified, push everything to the
remote. So, instead of the above manual steps, we could have simply run:
\
$ bdep release --alpha --push
releasing:
package: hello
current: 0.1.0-a.0.z
release: 0.1.0-a.1
open: 0.1.0-a.1.z
commit: yes
tag: v0.1.0-a.1
push: origin/master
continue? [y/n] y
[master 82a7e65] Release version 0.1.0-a.1
[master e6cf3c0] Change version to 0.1.0-a.1.z
pushing branch master, tag v0.1.0-a.1
To github.com:john-doe/hello.git
26ec5c9..e6cf3c0 master -> master
* [new tag] v0.1.0-a.1 -> v0.1.0-a.1
\
\N|The \c{release} command has a number of alternative modes, such as for
releasing a package revision, as well as a number of options that control
which version will be released and which version will be opened. See
\l{bdep-release(1)} for details.|
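\N|For instance, to release a revision of an already published version (say, to
fix a packaging issue without changing the upstream source), the sketch below
uses the \c{--revision} mode (see \l{bdep-release(1)} for its exact semantics):
\
$ git commit -a -m \"Fix manifest\"
$ bdep release --revision --push
\
|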
Publishing the final release to the version control repository is exactly the
same. This time, however, let's see how we can also publish it to an
archive-based repository. The first step is again to make the release, which
we will do with the help of the \c{release} command. Except now we will delay
opening the next development cycle by passing \c{--no-open} (there is also no
\c{--alpha} since this is the final release):
\
$ bdep release --no-open --push
releasing:
package: hello
current: 0.1.0-a.1.z
release: 0.1.0
commit: yes
tag: v0.1.0
push: origin/master
continue? [y/n] y
[master 00ed45a] Release version 0.1.0
pushing branch master, tag v0.1.0
To github.com:john-doe/hello.git
5d5094c..00ed45a master -> master
* [new tag] v0.1.0 -> v0.1.0
\
To publish our project to an archive-based repository we use the
\l{bdep-publish(1)} command. For example:
\
$ bdep publish
publishing:
to: https://cppget.org
as: John Doe <john@example.org>
package: hello
version: 0.1.0
project: hello
section: alpha
control: https://github.com/john-doe/hello.git
continue? [y/n] y
pushing branch build2-control
submitting hello-0.1.0.tar.gz
############################################################# 100.0%
package submission is queued: https://queue.cppget.org/hello/0.1.0
reference: 0c596fca2017
\
Let's see what's going on here. By default \c{publish} submits to the
\l{https://cppget.org cppget.org} repository. On \c{cppget.org} package names
are assigned on a first-come-first-served basis. But instead of using logins or
emails to authenticate package ownership, \c{cppget.org} uses your version
control repository as a proxy. In a nutshell, when we submit a package for the
first time, its control repository is associated with its name and all
subsequent submissions have to use the same control repository (the
authentication part). When submitting a package, \c{publish} also adds a file
to the \c{build2-control} branch of the control repository with the package
archive checksum. On the other side, \c{cppget.org} checks for the presence of
this file to make sure that whoever is making this submission has write
access to the control repository (the authorization part). See
\l{bdep-publish(1)} for details.
The rest should be pretty straightforward: \c{publish} prepares and uploads a
distribution of our package which goes into the \c{alpha} section of the
repository (because its major version is \c{0}). In response we get a URL
which we can use to check the status of our submission on
\l{https://queue.cppget.org queue.cppget.org}. And after some basic testing
and verification, our package should appear on \c{cppget.org} (the exact steps
are described in \l{https://cppget.org?submit Submission Policies}). Note also
that package submissions to \c{cppget.org} are public and permanent and cannot
be removed under any circumstances.
Finally, we also shouldn't forget to increment the version for the next
development cycle. For that we can use the \c{--open} mode of the \c{release}
command. For example:
\
$ bdep release --open --push
opening:
package: hello
current: 0.1.0
open: 0.2.0-a.0.z
commit: yes
push: origin/master
continue? [y/n] y
[master ace2f6e] Change version to 0.2.0-a.0.z
pushing branch master
To github.com:john-doe/hello.git
00ed45a..ace2f6e master -> master
\
\N|One sticky point of continuous versioning is choosing the next version.
For example, above should we continue with \c{0.1.1-a.0}, \c{0.2.0-a.0},
or \c{1.0.0-a.0}? The important rule to keep in mind is that we can jump
forward to any further version at any time and without breaking continuous
versioning. But we can never jump backwards.
For example, we can start with \c{0.2.0-a.0} but if we later realize that this
will actually be a new major release, we can easily change it to
\c{1.0.0-a.0}. As a result, the general guideline is to start conservatively
by either incrementing the patch or the minor version component. And the
recommended strategy is to increment the minor component and, if required,
release patch versions from a separate branch (created by branching off from
the release commit). This is the default behavior of the \c{release} command.
Note also that you don't have to make any pre-releases if you don't need them.
While during development you would still keep the version as \c{X.Y.Z-a.0}, at
release you simply change it directly to the final \c{X.Y.Z}.|
When publishing the final release you may also want to clean up now
obsolete pre-release tags. For example:
\
$ git tag -l 'v0.1.0-*' | xargs git push --delete origin
$ git tag -l 'v0.1.0-*' | xargs git tag --delete
\
\N|While at first removing such tags may seem like a bad idea, pre-releases
are by nature temporary and their use only makes sense until the final release
is published.
Also note that having a \c{git} repository with a large number of published
but unused version tags may result in a significant download overhead.|
Let's also briefly discuss in which situations we should increment each of the
version components. While \i{semver} gives basic guidelines, there are several
ways to apply them in the context of C/C++ where there is a distinction
between binary and source compatibility. We recommend that you reserve
\i{patch} releases for specific bug fixes and security issues that you can
guarantee with a high level of certainty to be binary-compatible. Otherwise,
if the changes are source-compatible, increment \i{minor}. And if they are
breaking (that is, the user code likely will need adjustments), increment
\i{major}. During early development, when breaking changes are frequent, it is
customary to use the \c{0.Y.Z} versions where \c{Y} effectively becomes the
\i{major} component. Again, refer to the \l{b#module-version \c{version}
Module} for a more detailed discussion of this topic.
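To summarize these rules with a concrete illustration:
\
1.2.3 -> 1.2.4  # patch: binary-compatible bug/security fix
1.2.3 -> 1.3.0  # minor: source-compatible changes and additions
1.2.3 -> 2.0.0  # major: breaking changes
0.5.2 -> 0.6.0  # early development: minor effectively acts as major
\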
\h#guide-dev-multi|Developing Multiple Packages and Projects|
How does a library like \c{libhello} get developed? It's possible someone woke
up one day and realized that they were going to build a useful library that
everyone was going to use. But somehow this doesn't feel like how it really
works. In the real world things start organically: someone had a project like
\c{hello} and then needed the same functionality in another project. Or
someone else needed it and asked the author to factor it out into a
library. For this approach to work, however, moving such common functionality
into a library and then continuing its parallel development must be a simple,
frictionless process. Let's see how this works in \c{build2}.
First, we need to decide whether to make \c{libhello} another package in our
\c{hello} project (that is, in the same \c{git} repository) or a separate
project (with a separate repository). Both arrangements are equally well
supported.
\N|A multi-package project works best if all the packages have the same
version and are released together. While the packages themselves can have
different versions (since each has its own \c{manifest}), in this scenario
following the release tagging recommendations discussed earlier will be
problematic.|
Let's start with a separate project since it is simpler. As the first step we
use \l{bdep-new(1)} to create a new library project next to our \c{hello}:
\
$ bdep new -t lib -l c++ libhello
created new library project libhello in /tmp/libhello/
$ ls
hello/
libhello/
hello-gcc/
hello-clang/
\
Let's also edit the generated \c{manifest} file and add the \c{project} value
(customarily after \c{version}) to indicate that our library belongs to the
same overall project as our executable:
\
$ cat libhello/manifest
: 1
name: libhello
version: 0.1.0-a.0.z
project: hello
summary: hello library
...
\
\N|The \c{project} value is used to group related packages together in order
to help with their organization and discovery. For example, if later we create
\c{libhello2} or \c{libhello-extra}, then it would make sense for them to also
belong to the \c{hello} project. See the \l{bpkg#manifest-package-project
\c{project}} value documentation for details.|
Our two projects will be sharing the same set of build configurations, so
next we initialize \c{libhello} in \c{hello-gcc} and \c{hello-clang}:
\
$ cd libhello
$ bdep init -A ../hello-gcc @gcc
initializing in project /tmp/libhello/
added configuration @gcc /tmp/hello-gcc/ default,auto-synchronized
synchronizing:
new libhello/0.1.0-a.0.19700101000000
$ bdep init -A ../hello-clang @clang
initializing in project /tmp/libhello/
added configuration @clang /tmp/hello-clang/ auto-synchronized
synchronizing:
new libhello/0.1.0-a.0.19700101000000
\
\N|If two or more projects share the same build configuration, then all of
them are always synchronized at once, regardless of the originating project.
It also makes sense to have the same default configuration and use identical
configuration names in all the projects.|
The last step is to move the desired functionality from \c{hello} to
\c{libhello} and at the same time add a dependency on \c{libhello}, just as we
did earlier (add a \c{depends} entry to \c{manifest}, then import the library
in \c{buildfile}, and so on). One interesting question is what to put as a
prerequisite repository in \c{repositories.manifest}. Our own setup will work
even if we don't put anything there \- the dependency will be automatically
resolved to our local version of \c{libhello} since we have initialized it in
all our build configurations. However, in case our \c{hello} repository is
used by someone else, it's a good idea to add the remote \c{git} repository
for \c{libhello} as a prerequisite.
\N|By now you have probably realized that our project directory is just
another type of package repository. See \l{bpkg-repository-types(1)} for
more information.|
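To make this last step more concrete, here is a sketch of the relevant changes
in the \c{hello} package. In its \c{manifest} we declare the dependency (the
version constraint is omitted here; you would normally specify one):
\
depends: libhello
\
And in its source subdirectory \c{buildfile} we import the library:
\
import libs += libhello%lib{hello}
\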
And that's it, now we can build and test our new arrangement:
\
$ cd ../hello # back to hello project root
$ bdep test -i
c++ ../libhello/libhello/cxx{hello}
c++ ../libhello/tests/basics/cxx{driver}
c++ hello/cxx{hello}
ld ../hello-gcc/libhello/libhello/libs{hello}
ld ../hello-gcc/libhello/tests/basics/exe{driver}
ld ../hello-gcc/hello/hello/exe{hello}
test ../hello-gcc/libhello/tests/basics/exe{driver}
test hello/testscript{testscript} ../hello-gcc/hello/hello/exe{hello}
\
This is also the approach we would use if we wanted to fix a bug in someone
else's library. That is, we would clone their project repository and
initialize it in the build configurations of our project which will
\"upgrade\" the dependency to use the local version. Then we make the fix,
submit it upstream, and continue using the local version until our fix is
merged/published, at which point we deinitialize the project and switch
back to using the upstream version.
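As a sketch of this workflow (using a hypothetical \c{libfoo} as the dependency
being fixed):
\
$ git clone .../libfoo.git        # clone the upstream repository
$ cd libfoo
$ bdep init -A ../hello-gcc @gcc  # dependency now resolves to the local copy
$ cd ../hello
$ bdep sync                       # rebuild against the local libfoo
...                               # fix, test, submit upstream
$ cd ../libfoo
$ bdep deinit -a                  # switch back to the upstream version
\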
Let's now examine the second option: making \c{libhello} a package inside
\c{hello}. Here is the original structure of our \c{hello} project:
\
hello/
├── .git/
├── build/
├── hello/
│   ├── hello.cxx
│   └── buildfile
├── buildfile
├── manifest
└── repositories.manifest
\
As the first step, we move the \c{hello} program into its own subdirectory:
\
hello/
├── .git/
├── hello/
│   ├── build/
│   ├── hello/
│   │   ├── hello.cxx
│   │   └── buildfile
│   ├── buildfile
│   └── manifest
└── repositories.manifest
\
Next we again use \l{bdep-new(1)} to create a new library but this time
as a package inside an already existing project:
\
$ cd hello
$ bdep new --package -t lib -l c++ libhello
created new library package libhello in /tmp/hello/libhello/
\
Let's see what our project looks like now:
\
hello/
├── .git/
├── hello/
│   ├── ...
│   └── manifest
├── libhello/
│   ├── ...
│   └── manifest
├── packages.manifest
└── repositories.manifest
\
\N|Notice that, as discussed earlier, \c{repositories.manifest} belongs to
the project (repository) while \c{manifest} \- to the package.|
Besides the \c{libhello} directory the \c{new} command also created the
\c{packages.manifest} file in the root directory of our project. Let's take a
look inside:
\
$ cat packages.manifest
: 1
location: libhello/
\
Up until now our \c{hello} was a simple, single-package project that didn't
need this file \- \c{manifest} in its root directory was sufficient (see
\l{bpkg-repository-types(1)} for details on the project repository
structure). But now it contains several packages and we need to specify where
they are located within the project. So let's go ahead and add the location
of the \c{hello} package:
\
$ cat packages.manifest
: 1
location: libhello/
:
location: hello/
\
\N|Packages in a project can reside next to each other or in subdirectories
but they cannot nest. When published to an archive-based repository, each
such package will be placed into its own archive.|
Next we initialize the new package in all our build configurations:
\
$ cd libhello
$ bdep init -a
initializing in project /tmp/hello/
in configuration @gcc:
synchronizing:
upgrade hello/0.1.0-a.0.19700101000000#1
new libhello/0.1.0-a.0.19700101000000
in configuration @clang:
synchronizing:
upgrade hello/0.1.0-a.0.19700101000000#1
new libhello/0.1.0-a.0.19700101000000
\
\N|Notice that the \c{hello} package has been \"upgraded\" to reflect its
new location.|
Finally, as before, we move the desired functionality from \c{hello} to
\c{libhello} and at the same time add a dependency on \c{libhello}. Note,
however, that in this case we don't need to add anything to
\c{repositories.manifest} since both packages are in the same project
(repository). And that's it, now we can build and test our new arrangement:
\
$ cd .. # back to hello project root
$ bdep test
c++ libhello/libhello/cxx{hello}
c++ libhello/tests/basics/cxx{driver}
c++ hello/hello/cxx{hello}
ld ../hello-gcc/libhello/libhello/libs{hello}
ld ../hello-gcc/libhello/tests/basics/exe{driver}
ld ../hello-gcc/hello/hello/exe{hello}
test ../hello-gcc/libhello/tests/basics/exe{driver}
test hello/hello/testscript{testscript} ../hello-gcc/hello/hello/exe{hello}
\
\h#guide-consume-pkg|Package Consumption|
Ok, now that we have published a few releases of \c{hello}, how would the
users of our project get them? While they could clone the repository and use
\c{bdep} just like we did, this is more of a development rather than
consumption workflow. For consumption it is much easier to use the package
dependency manager, \l{bpkg(1)}, directly.
\N|Note that this approach also works for libraries in case you wish to use
them in a project with a build system other than \c{build2}. See
\l{#guide-unpackaged-deps Using Unpackaged Dependencies} for background on
cross-build system library consumption.|
First, we create a suitable build configuration with the
\l{bpkg-cfg-create(1)} command. We can use the same place for building all our
tools so let's call the directory \c{tools}. Seeing that we are only
interested in using (rather than developing) such tools, let's build them
optimized and also configure a suitable installation location:
\
$ bpkg create -d tools cc \
config.cxx=g++ \
config.cc.coptions=-O3 \
config.install.root=/usr/local \
config.install.sudo=sudo
created new configuration in /tmp/tools/
$ cd tools
\
The same step on Windows using Visual Studio would look like this (again,
remember to run this from the Visual Studio development command prompt):
\
$ bpkg create -d tools cc ^
config.cxx=cl ^
config.cc.coptions=/O2 ^
config.install.root=C:\install
\
To fetch and build packages (as well as all their dependencies) we use the
\l{bpkg-pkg-build(1)} command. We can use either an archive-based repository
like \l{https://cppget.org cppget.org} or build directly from \c{git}:
\
$ bpkg build hello@https://git.build2.org/hello/hello.git
fetching from https://git.build2.org/hello/hello.git
new libformat/1.0.0 (required by libhello)
new libprint/1.0.0 (required by libhello)
new libhello/1.1.0 (required by hello)
new hello/1.0.0
continue? [Y/n] y
configured libformat/1.0.0
configured libprint/1.0.0
configured libhello/1.1.0
configured hello/1.0.0
c++ libprint-1.0.0/libprint/cxx{print}
c++ hello-1.0.0/hello/cxx{hello}
c++ libhello-1.1.0/libhello/cxx{hello}
c++ libformat-1.0.0/libformat/cxx{format}
ld libprint-1.0.0/libprint/libs{print}
ld libformat-1.0.0/libformat/libs{format}
ld libhello-1.1.0/libhello/libs{hello}
ld hello-1.0.0/hello/exe{hello}
updated hello/1.0.0
\
\N|Passing a repository URL to the \c{build} command is a shortcut to the
following sequence of commands:
\
$ bpkg add https://git.build2.org/hello/hello.git # add repository
$ bpkg fetch # fetch package list
$ bpkg build hello # build package by name
\
|
Once built, we can install the package to the location that we have specified
with \c{config.install.root} using the \l{bpkg-pkg-install(1)} command:
\
$ bpkg install hello
...
install libformat-1.0.0/libformat/libs{format}
install libprint-1.0.0/libprint/libs{print}
install libhello-1.1.0/libhello/libs{hello}
install hello-1.0.0/hello/exe{hello}
$ hello World
Hello, World!
\
\N|If on your system the installed executables don't run from \c{/usr/local}
because of the unresolved shared libraries (or if you are installing somewhere
else, such as \c{/opt}), then the easiest way to fix this is with \i{rpath}.
Simply add the following configuration variable when creating the build
configuration (or as an argument to the \c{install} command):
\
config.bin.rpath=/usr/local/lib
\
Note to Windows users: this is not an issue on this platform since executables
and shared (DLL) libraries are installed into the same subdirectory (\c{bin})
of the installation directory.|
The installation contents and layout under \c{config.install.root} would be
along these lines:
\
/usr/local/
├── bin/
│   └── hello
├── include/
│   ├── libformat/
│   │   ├── export.hxx
│   │   ├── format.hxx
│   │   └── version.hxx
│   ├── libhello/
│   │   ├── export.hxx
│   │   ├── hello.hxx
│   │   └── version.hxx
│   └── libprint/
│       ├── export.hxx
│       ├── print.hxx
│       └── version.hxx
├── lib/
│   ├── libformat-1.0.so
│   ├── libformat.so -> libformat-1.0.so
│   ├── libhello-1.1.so
│   ├── libhello.so -> libhello-1.1.so
│   ├── libprint-1.0.so
│   ├── libprint.so -> libprint-1.0.so
│   └── pkgconfig/
│       ├── libformat.shared.pc
│       ├── libhello.shared.pc
│       └── libprint.shared.pc
└── share/
    └── doc/
        ├── libformat/
        │   └── manifest
        ├── libhello/
        │   └── manifest
        └── libprint/
            └── manifest
\
\N|The installation locations of various types of files (executables,
libraries, headers, documentation, etc) can be customized using a number of
the \c{config.install.*} variables with the most commonly used ones and their
defaults (relative to \c{config.install.root}) listed below (see the
\c{install} build system module documentation for the complete list).
\
config.install.bin     = root/bin/
config.install.lib     = root/lib/
config.install.doc     = root/share/doc/
config.install.man     = root/share/man/
config.install.include = root/include/
\
|
If we need to uninstall a previously installed package, there is the
\l{bpkg-pkg-uninstall(1)} command:
\
$ bpkg uninstall hello
uninstall hello-1.0.0/hello/exe{hello}
uninstall libhello-1.1.0/libhello/libs{hello}
uninstall libprint-1.0.0/libprint/libs{print}
uninstall libformat-1.0.0/libformat/libs{format}
...
\
To upgrade or downgrade packages we again use the \c{build} command. Here
is a typical upgrade workflow:
\
$ bpkg fetch # refresh available package list
$ bpkg status # see if new versions are available
$ bpkg uninstall hello # uninstall old version
$ bpkg build hello # upgrade to the latest version
$ bpkg install hello # install new version
\
Similar to \c{bdep}, to downgrade we have to specify the desired version
explicitly. There are also the \c{--upgrade|-u} and \c{--patch|-p} as well as
\c{--immediate|-i} and \c{--recursive|-r} options that allow us to upgrade or
patch packages that we have built and/or their immediate or all dependencies
(see \l{bpkg-pkg-build(1)} for details). For example, to make sure everything
is patched, run:
\
$ bpkg fetch
$ bpkg build -pr
\
If a package is no longer needed, we can remove it from the configuration with
\l{bpkg-pkg-drop(1)}:
\
$ bpkg drop hello
following dependencies were automatically built but
will no longer be used:
libhello
libformat
libprint
drop unused packages? [Y/n] y
drop hello
drop libhello
drop libformat
drop libprint
continue? [Y/n] y
purged hello
purged libhello
purged libformat
purged libprint
\
\h#guide-system-deps|Using System-Installed Dependencies|
Our operating system might already have a package manager (which we will refer
to as \i{system package manager}) and for various reasons we may want to use
the system-installed version of a dependency rather than building one from
source.
\N|Using system-installed versions works best for mature rather than
rapidly-developed packages since for the latter you often need to track the
latest version (which may not yet be available from the system repository)
and/or test with multiple versions (which is not something that many system
package managers support).|
We can instruct \c{build2} to configure a dependency package as available from
the system rather than building it from source. Let's see how this works in an
example. Say, we want to use \l{https://cppget.org/libsqlite3 \c{libsqlite3}}
in our \c{hello} project.
The first step is to add it as a dependency, just like we did for \c{libhello}.
That is, add another \c{depends} entry to \c{manifest}, then import it in
\c{buildfile}, and so on.
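For example, the changes could look along these lines, with the version
constraint and the imported target name being assumptions to verify against
the package's documentation. In \c{manifest}:
\
depends: libsqlite3 ^3.18.0
\
And in \c{buildfile}:
\
import libs += libsqlite3%lib{sqlite3}
\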
\N|Note that the dependency still has to be packaged and available from one of
the project's prerequisite repositories. However, it can be a \i{stub} \- a
package that does not contain any source code and that can only be \"obtained\"
from the system (see \l{bpkg#package-version Package Version} for
details). See also \l{#guide-unpackaged-deps Using Unpackaged Dependencies}
for how to deal with dependencies that are not packaged.|
Now, if we just run \c{sync} or try to build our project, \c{build2} will
download and build the new dependency from source, just like it did for
\c{libhello}. Instead, we can issue an explicit \c{sync} command that
configures the \c{libsqlite3} package as coming from the system:
\
$ bdep sync ?sys:libsqlite3
synchronizing:
configure sys:libsqlite3/*
upgrade hello/0.1.0-a.0.19700101000000#3
\
Here \cb{?} is a package \i{flag} that instructs \c{build2} to treat it as a
dependency and \cb{sys} is a package \i{scheme} that tells \c{build2} it comes
from the system. See \l{bpkg-pkg-build(1)} for details.
\N|We can have some build configurations using a system-installed version of
a dependency while others building it from source, for example, for testing.|
\N|The system-installed dependency doesn't really have to come from the system
package manager. It can also be manually installed and, as discussed in
\l{#guide-unpackaged-deps Using Unpackaged Dependencies}, not necessarily into
the system-default location like \c{/usr/local}.|
\N|Currently, unless we specify the installed version explicitly, a
system-installed package is assumed to satisfy any dependency constraint. In
the future, \c{build2} will automatically query commonly used system package
managers for the installed version and maybe even request installation of the
absent packages. To support this functionality, the package manifest may need
to specify package name mappings for various system package managers (which is
the rationale behind stub packages).|
\h#guide-unpackaged-deps|Using Unpackaged Dependencies|
Generally, we will have a much better time if all our dependencies come as
\c{build2} packages. Unfortunately, this won't always be the case in the real
world and some libraries that you may need will use other build systems.
\N|There is also the opposite problem: you may want to consume a library that
uses \c{build2} in a project that uses a different build system. For that
refer to \l{#guide-consume-pkg Package Consumption}.|
The standard way to consume such unpackaged libraries is to install them (not
necessarily into a system-default location like \c{/usr/local}) so that we
have a single directory with their headers and a single directory with their
libraries. We can then configure our builds to use these directories when
searching for imported libraries.
\N|Needless to say, none of the \c{build2} dependency management mechanisms
such as version constraints or upgrade/downgrade will work on such unpackaged
libraries. You will have to manage all of this manually.
Let's see how this all works in an example. Say, we want to use \c{libextra}
that uses a different build system in our \c{hello} project. The first step
is to manually build and install this library for each build configuration
that we have. For example, we can install all such unpackaged libraries into
\c{unpkg-gcc} and \c{unpkg-clang}, next to our \c{hello-gcc} and
\c{hello-clang} build configurations:
\
$ ls
hello/
hello-gcc/
unpkg-gcc/
hello-clang/
unpkg-clang/
\
\N|If you would like to try this out but don't have a suitable \c{libextra},
you can create and install one with these commands:
\
$ bdep new -t lib -l c++ libextra -C libextra-gcc cc config.cxx=g++
$ b install: libextra-gcc/ config.install.root=/tmp/unpkg-gcc
\
|
If we look inside one of these \c{unpkg-*} directories, we should see
something like this:
\
$ tree unpkg-gcc
unpkg-gcc/
├── include/
│   └── libextra/
│       └── extra.hxx
└── lib/
    ├── libextra.a
    ├── libextra.so
    └── pkgconfig/
        └── libextra.pc
\
Notice the \c{libextra.pc} file \- it is a \cb{pkg-config(1)} file that contains
any extra compile and link options that may be necessary to consume this
library. This is the \i{de facto} standard for build systems to communicate
library build information to each other and is today supported by most
commonly used implementations. Speaking of \c{build2}, it both recognizes
\c{.pc} files when consuming third-party libraries and automatically produces
them when installing its own.
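\N|For reference, a minimal \c{.pc} file for our hypothetical \c{libextra}
could look something like this (the paths and version are illustrative):
\
# lib/pkgconfig/libextra.pc
#
prefix=/tmp/unpkg-gcc
libdir=${prefix}/lib
includedir=${prefix}/include
Name: libextra
Description: Extra functionality library.
Version: 1.0.0
Cflags: -I${includedir}
Libs: -L${libdir} -lextra
\
|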
\N|While this may all seem foreign to Windows users, there is nothing
platform-specific about this approach, including support for \c{pkg-config},
which, at least in case of \c{build2}, works equally well on Windows.|
Next, we create a build configuration and configure it to use one of these
\c{unpkg-*} directories (replace \c{...} with the absolute path):
\
$ bdep init -C ../hello-gcc @gcc cc config.cxx=g++ \
config.cc.poptions=-I.../unpkg-gcc/include \
config.cc.loptions=-L.../unpkg-gcc/lib
\
\N|If using Visual Studio, replace \c{-I} with \c{/I} and \c{-L} with
\c{/LIBPATH:}.|
Alternatively, if you want to reconfigure one of the existing build
configurations, then simply edit the \c{build/config.build} file (that is,
\c{hello-gcc/build/config.build} in our case) and adjust the \c{poptions} and
\c{loptions} values. Or you can use the build system directly to reconfigure
the build configuration (see \l{b(1)} for details):
\
$ b configure: ../hello-gcc/ \
config.cc.poptions+=-I.../unpkg-gcc/include \
config.cc.loptions+=-L.../unpkg-gcc/lib
\
\N|If all the unpackaged libraries included \c{.pc} files, then the \c{-L}
alone would have been sufficient. However, it doesn't hurt to also add
\c{-I}, for good measure.|
Once this is done, adjust your \c{buildfile} to import the library:
\
import libs += libextra%lib{extra}
\
And your source code to use it:
\
#include <libextra/extra.hxx>
\
\N|Notice that we don't add the corresponding \c{depends} value to the
project's \c{manifest} since this library is not a package. However, it is a
good idea to instead add a \l{bpkg#manifest-package-requires \c{requires}}
entry as a documentation to users of our project.|
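\N|For example, such an entry could look along these lines (the exact wording
is up to you):
\
requires: libextra ; Unpackaged dependency, must be installed manually.
\
|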
\h1#proj-struct|Canonical Project Structure|
The goal of establishing a canonical project structure is to create an
ecosystem of packages that can coexist, are easy to comprehend by both humans
and tools, scale to complex, real-world requirements, and, last but not least,
are pleasant to work with.
The canonical structure is primarily meant for a package \- a single library
or program (or, sometimes, a collection of related programs) with a specific
and well-defined function. While it may be less suitable for more elaborate,
multi-library/program \i{end-products} that are not meant to be packaged, most
of the recommendations discussed below would still apply. Oftentimes, you
would start with a canonical project and expand from there. Note also that
while the discussion below focuses on C++, most of it applies equally to C
projects.
\N|We often find ourselves factoring common functionality out of such
end-products and into separate packages, for example, in order to reuse them in
another end-product. In this light, it can be helpful to start a new
end-product project as a composition of individual packages that follow the
canonical structure.|
Projects created by the \l{bdep-new(1)} command have the canonical structure.
The overall layouts for executable (\c{-t\ exe}) and library (\c{-t\ lib})
projects are presented below.
\
<name>/
├── build/
├── <name>/
│   ├── <name>.cxx
│   ├── <name>.test.cxx
│   ├── testscript
│   └── buildfile
├── buildfile
└── manifest
\
\
lib<name>/
├── build/
├── lib<name>/
│   ├── <name>.hxx
│   ├── <name>.cxx
│   ├── <name>.test.cxx
│   ├── export.hxx
│   ├── version.hxx.in
│   └── buildfile
├── tests/
├── buildfile
└── manifest
\
The canonical structure for both project types is discussed in detail in
the following sections with a short summary of the key points presented
below.
\ul|
\li|\n\n\i{Header and source files (or module interface and implementation
files) are next to each other (no \c{include/} and \c{src/} split).}|
\li|\n\i{Headers are included with \c{<>} and contain the project directory
prefix, for example, \c{<libhello/hello.hxx>}.}|
\li|\n\i{Header and source file extensions are either \c{.hpp/.cpp} or
\c{.hxx/.cxx} (\c{.mpp} or \c{.mxx} for module interfaces).}|
\li|\n\i{No special characters other than \c{_} and \c{-} in file names with
\c{.} only used for extensions.}\n\n||
Let's start with naming our projects: A project name should only contain ASCII
alphabetic characters (\c{[a-zA-Z]}), digits (\c{[0-9]}), underscores (\c{_}),
plus/minus (\c{+-}), and dots (\c{.}) as well as be at least two characters
long (see \l{bpkg#package-name Package Name} for additional restrictions and
recommendations).
If a project consists of a library and an executable, then they should be
split into separate packages (see \l{#guide-dev-multi Developing Multiple
Packages and Projects} for some common arrangements). In this case, by
convention, the library name should start with the \c{lib} prefix, for
example, \c{libhello} and \c{hello}. It is also strongly recommended (but not
required) to follow this convention in new projects, even if there are no
plans to have a related executable.
\N|Using the \c{lib} prefix consistently offers several benefits:
\ol|
\li|\nIt is clear from the name to both humans and tools what kind of project
it is.|
\li|\nAll libraries are consistently named (as opposed to some with the
\c{lib} prefix and some without).|
\li|\nAll library names are future-proofed to co-exist with executables. If
one starts with a library without the \c{lib} prefix but later decides to add
an executable, renaming the library is unlikely to be an option. And there is
no need to spend mental energy on thinking whether it's possible that an
executable will be added later.||
|
The project's root directory should contain the root \c{buildfile} and package
\c{manifest} file. Other recommended top-level subdirectory names are
\c{examples/} (for libraries it is normally a subproject like \c{tests/}, as
discussed below), \c{doc/}, and \c{etc/} (sample configurations, scripts,
third-party contributions, etc). See also build system \l{b#intro-proj-struct
Project Structure} for details on the build-related files (\c{buildfile}) and
subdirectories (\c{build/}) as well as the available alternative naming
scheme.
\h#proj-struct-src-dir|Source Directory|
The project's source code is placed into a subdirectory of the root directory
named the same as the project, for example, \c{hello/hello/} or
\c{libhello/libhello/}. It is called the project's \i{source subdirectory}.
There are several reasons for this layout: It implements the canonical
inclusion scheme (discussed below) where each header is prefixed with its
project name. It also has a predictable name where users (and tools) can
expect to find our project's source code. Finally, this layout prevents
clutter in the project's root directory which usually contains various other
files (like \c{README}, \c{LICENSE}) and directories (like \c{doc/},
\c{tests/}, \c{examples/}).
\N|Another popular approach is to place public headers into the \c{include/}
subdirectory and source files as well as private headers into \c{src/}. The
cited advantage of this layout is the predictable location (\c{include/}) that
contains only the project's public headers (that is, its API). This can make
the project easier to navigate and understand while harder to misuse, for
example, by including a private header.
However, this split layout is not without drawbacks:
\ul|
\li|Navigating between corresponding headers and sources is cumbersome. This
affects editing, grep'ing, as well as code browsing (for example, on GitHub).|
\li|Implementing the canonical inclusion scheme would require an extra level
of subdirectories (for example, \c{include/libhello/} and \c{src/libhello/}),
which only amplifies the previous issue.|
\li|Supporting generated source code can be challenging: Source code
generators rarely provide support for writing headers and sources into
different directories. Even if we can move things around post-generation,
build systems may not support this arrangement (for example, \c{build2} does
not currently support target groups with members in different directories).||
Also, the stated advantage of this layout \- separation of public headers from
private \- is not as clear cut as it may seem at first. The common assumption
of the split layout is that only headers from \c{include/} are installed and,
conversely, to use the headers in-place, all one has to do is add \c{-I}
pointing to \c{include/}. On the other hand, it is common for public headers
to include private, for example, to call an implementation detail function in
inline or template code (note that the same applies to private modules
imported in public module interfaces). Which means such private, (or, probably
now more accurately called \i{implementation detail}) headers have to be
placed in the \c{include/} directory as well, perhaps into a subdirectory
(such as \c{details/}) or with a file name suffix (such as \c{-impl}) to
signal to the user that they are still \"private\". Needless to say, in an
actively developed project, keeping track of which private headers can still
stay in \c{src/} and which have to be moved to \c{include/} (and vice versa)
is a tedious, error-prone task. As a result, practically, the split layout
quickly degrades into the \"all headers in \c{include/}\" arrangement which
negates its main advantage.
It is also not clear how the split layout will translate to modularized
projects. With modules, both the interface and implementation (including
non-inline/template function definitions) can reside in the same file with a
substantial number of C++ developers finding this arrangement appealing. If a
project consists of only such single-file modules, then \c{include/} and
\c{src/} have effectively become the same thing (note that there couldn't be
any \"private\" modules in \c{src/} since there would be nobody to import
them). In a sense, we already have this situation with header-only libraries
except that in the case of modules calling the directory \c{include/} would be
an anachronism.
To summarize, the split directory arrangement offers little benefit over the
single directory layout, has a number of real drawbacks, and does not fit
modularized projects well. In practice, private headers are placed into
\c{include/}, often either in a subdirectory or with a special file name
suffix, a mechanism that is readily available in the single directory layout.|
All headers within a project should be included using the \c{<>} style
inclusion and contain the project name as a directory prefix. And all headers
means \i{all headers} \- public, private, or implementation detail, in
executables or in libraries.
As an example, let's say we've added \c{utility.hxx} to our \c{hello} project.
This is how it should be included in \c{hello.cxx}:
\
// #include \"utility.hxx\" // Wrong.
// #include <utility.hxx> // Wrong.
// #include \"../hello/utility.hxx\" // Wrong.
#include <hello/utility.hxx>
\
Similarly, if we want to include \c{hello.hxx} from \c{libhello}, then the
inclusion should look like this:
\
#include <libhello/hello.hxx>
\
\N|The problem with the \c{\"\"} style inclusion is if the header is not found
relative to the including file, most compilers will continue looking for it in
the include search paths, the same as for \c{<>}. As a result, if the header
is not present in the right place (for example, because it was mistakenly not
listed as to be installed), chances are that a completely unrelated header
with the same name will be found and included. Needless to say, debugging
situations like these is unpleasant.
Prefixing all inclusions with the project name also makes sure that headers
with common names (for example, \c{utility.hxx}) can coexist (for example,
when installed into a system-wide directory, such as \c{/usr/include}). The
prefix also plays an important role in supporting auto-generated headers.
Note also that this header inclusion scheme is consistent with the module
importation, for example:
\
import hello.utility;
\
Finally, note that while adding the project prefix to the \c{\"\"} style
inclusion (for example, \c{\"libhello/hello.hxx\"}) will make finding an
unrelated header unlikely, there is still a possibility. And it is not clear
why take the chance when there are no benefits. So let's imagine the \c{\"\"}
style inclusion does not exist and we will all have a much better time.|
If you have to disregard every rule and recommendation in this section but
one, for example, because you are working on an existing library, then insist
on this: \b{public header inclusions must use the library name as a directory
prefix}.
The project's source subdirectory can have subdirectories of its own, for
example, to organize the code into components. Naturally, header inclusions
will need to contain such subdirectories, for example
\c{<libhello/core/hello.hxx>}. When the project's headers are installed (for
example, into \c{/usr/include}), this subdirectory hierarchy is
automatically recreated.
If you would like to separate public API headers/modules from implementation
details, the convention is to place them into the \c{details/}
subdirectory. For example:
\
libhello/
└── libhello/
    ├── details/
    │   └── utility.hxx
    └── ...
\
\N|If a project has truly private headers (for example, proprietary code) that
must be clearly separated from public and implementation detail headers, then
they can be placed into the \c{private/} subdirectory, next to
\c{details/}. In a sense, this arrangement mimics the C++
public/protected/private member access.|
It is recommended that you still install the implementation detail headers
and modules for the reasons discussed above. If, however, you would like to
disable their installation, you can add the following line to your source
subdirectory \c{buildfile}:
\
details/hxx{*}: install = false
\
\N|If you are creating a \i{family of libraries} with a common name prefix,
then it may make sense to use a nested source directory layout with a common
top-level directory. As an example, let's say we have the \c{libstud-path} and
\c{libstud-url} libraries that belong to the same \c{libstud} family. Their
source subdirectory layouts could look like this:
\
libstud-path/
└── libstud/
    └── path/
        ├── path.hxx
        ├── path-io.hxx
        ├── ...
        └── buildfile
libstud-url/
└── libstud/
    └── url/
        ├── url.hxx
        ├── url-io.hxx
        ├── ...
        └── buildfile
\
With the header inclusion paths adjusted accordingly:
\
#include <libstud/path/path.hxx>
#include <libstud/url/url.hxx>
\
|
\h#proj-struct-src-name|Source Naming|
When naming source files, only use ASCII alphabetic characters, digits, as
well as \c{_} (underscore) and \c{-} (minus). Use \c{.} (dot) only for
extensions, that is, trailing parts of the name that \i{classify} your files.
Examples of good names:
\
SmallVector.hxx
small-vector.hxx
small_vector.hxx
small-vector.test.cxx
\
Examples of bad names:
\
small+vector.hxx
small.vector.hxx
\
\N|If you are using \c{_} or \c{-} as word separators in filesystem names,
pick one and use it consistently throughout the project.|
The C source file extensions are always \c{.h}/\c{.c}. The two alternative C++
source file extension schemes are \c{.?pp} and \c{.?xx}:
\
file       .?pp   .?xx
header     .hpp   .hxx
module     .mpp   .mxx
inline     .ipp   .ixx
template   .tpp   .txx
source     .cpp   .cxx
\
\N|The \c{.mxx}/\c{.mpp} extension is for the module interface translation
units with module implementation units (if any) using the \c{.cxx}/\c{.cpp}
extension. If both are present, then it makes sense to use the same base name,
similar to headers. For example:
\
hello-core.mxx
hello-core.cxx
\
|
The use of inline and template files is a matter of taste. If used, they are
included at the end of the header/module files and contain definitions of
inline and non-inline template functions, respectively. The \c{.?xx}/\c{.?pp}
files with the same name (or, sometimes, name prefix) are assumed to be
related and are collectively called a \i{module}. \N{This term is meant to
correspond directly to a C++ module.}
By default the \l{bdep-new(1)} command uses the \c{.?xx} naming scheme. To use
\c{.?pp} instead, pass \c{-t\ c++,cpp}.
\N|There are several reasons not to \"reuse\" the \c{.h} C header extension
for C++ files:
\ul|
\li|There can be a need for both C and C++ headers for the same module.|
\li|It allows tools to accurately determine the language from the file name.|
\li|It is easier to search for C++ source code using wildcard patterns
(\c{*.?pp}).||
The last two reasons are also why headers without extensions are probably not
worth the trouble.|
Source files corresponding to C++ modules need to embed a sufficient amount of
\"module name tail\" in their names to unambiguously resolve all the modules
used in a project. When deriving file names from C++ module names, \c{.} (dot)
should be replaced with either \c{_} (underscore), \c{-} (minus), a case
change, or a directory separator, according to your project's file naming
scheme. For example, if our \c{libhello} had two modules, \c{hello.core} and
\c{hello.extra}, then their interface units could be named as follows:
\
hello-core.mxx
hello-extra.mxx
hello_core.mxx
hello_extra.mxx
HelloCore.mxx
HelloExtra.mxx
hello/core.mxx
hello/extra.mxx
core.mxx
extra.mxx
\
As discussed in the next section, public module names should start with the
project name and for such modules it is customary to omit this first component
from file names (the last variant in the above example). See also
\l{b#cxx-modules-build Building Modules} for a more detailed discussion of
the module name to file name mapping.
\h#proj-struct-src-content|Source Contents|
Let's now move inside our source files. All macros defined by a project, such
as include guards, version and symbol export macros, etc., must all start with
the project name (including the \c{lib} prefix for libraries), for example
\c{LIBHELLO_VERSION}. Similarly, the library's namespace and module names
(both public and implementation detail) should all start with the library name
but without the \c{lib} prefix. For example:
\
// libhello/hello.mxx
export module hello.core;
namespace hello
{
...
}
\
An executable project may use a namespace (in which case it is natural to call
it after the project) and its (private) modules shouldn't be qualified with
the project name (in order not to clash with similarly named modules from the
corresponding library, if any). \N{A library may also have private modules in
which case they shouldn't be qualified either.}
\N|Hopefully by now the recommendation for the \c{lib} prefix should be easy
to understand: oftentimes executables and libraries come in pairs, for example
\c{hello} and \c{libhello}, with the reusable functionality being factored out
from the executable into the library. It is natural to want to use the same
name \i{stem} (\c{hello} in our case) for both.
The above naming scheme (with the \c{lib} prefix present in some names but not
others) is carefully chosen to allow such library/executable pairs to coexist
and be used together without too much friction. For example, both the library
and executable can have a header called \c{utility.hxx} with the executable
being able to include both and even get the \"merged\" functionality without
extra effort (since they use the same namespace):
\
// hello/hello.cxx
#include <hello/utility.hxx>
#include <libhello/utility.hxx>
namespace hello
{
// Contains names from both utilities.
}
\
|
A canonical library project contains two special headers: \c{export.hxx} (or
\c{export.hpp}) that defines the library's symbol exporting macro as well as
\c{version.hxx} (or \c{version.hpp}) that defines the library's version macros
(see \l{b#module-version \c{version} Module} for details).
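\N|For illustration, here is a sketch of how the export macro is typically used
in a public header (the \c{say_hello()} function is hypothetical and
\c{LIBHELLO_SYMEXPORT} is assumed to be the macro name defined by the generated
\c{export.hxx}):
\
// libhello/hello.hxx (sketch)
//
#include <string>
#include <libhello/export.hxx>
namespace hello
{
  // Part of the library's public (exported) API.
  //
  LIBHELLO_SYMEXPORT void
  say_hello (const std::string& name);
}
\
|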
\h#proj-struct-tests|Tests|
A project may have \i{unit} and/or \i{functional/integration} tests. Unit
tests exercise each module's (potentially private) functionality in
isolation. In contrast, functional/integration tests exercise the project via
its public API, just like the real users of the project would.
A source file that implements a module's unit tests should be placed next to
that module's files and be called with the module's name plus the \c{.test}
second-level extension. It is expected to implement an executable (that is,
define \c{main()}). If a module uses Testscript for unit testing, then the
corresponding file should be called with the module's name plus the
\c{.test.testscript} extension. For example:
\
libhello/
└── libhello/
    ├── hello.hxx
    ├── hello.cxx
    ├── hello.test.cxx
    └── hello.test.testscript
\
\N|All source files (that is, headers, modules, etc) with the \c{.test}
second-level extension are assumed to belong to unit tests and are
automatically excluded from the library/executable sources.|
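\N|As an illustration, a unit test implementation for the above layout could
look along these lines (the \c{format_hello()} function being tested is
hypothetical):
\
// libhello/hello.test.cxx (sketch)
//
#include <cassert>
#include <libhello/hello.hxx>
int main ()
{
  // Exercise the library's internals in isolation.
  //
  assert (hello::format_hello (\"World\") == \"Hello, World!\");
}
\
|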
A library's functional/integration tests should go into the \c{tests/}
subdirectory. Each such test should reside in a separate subdirectory,
potentially organized into nested subdirectories (for instance, to correspond
to the source directory components). For example, if we were creating an XML
parsing and serialization library, then our \c{tests/} could have the
following layout:
\
tests/
├── basics/
│   ├── driver.cxx
│   └── buildfile
├── parser/
│   ├── pull/
│   │   ├── driver.cxx
│   │   └── buildfile
│   └── push/
│       ├── driver.cxx
│       └── buildfile
└── serializer/
    └── ...
\
In the canonical library project created by \c{bdep-new} the \c{tests/}
subdirectory is an unnamed subproject (in the build system terms). This allows
us to build and run tests against an installed version of the library (see
\l{b#intro-operations-test Testing} for more information on the contents of
this directory).
\N|The \c{build2} CI implementation will automatically perform the
installation test if a project contains the \c{tests/} subproject. See
\c{bbot} \l{bbot#arch-worker Worker Logic} for details.|
By default executable projects do not have the \c{tests/} subproject, instead
placing integration tests next to the source code (the \c{testscript} file;
see \l{testscript The build2 Testscript Language} for details). However, if
desired, executable projects can have the \c{tests/} subproject, the same as
libraries.
\N|By default projects created by \c{bdep-new} include support for
functional/integration testing but exclude support for unit testing. These
defaults, however, can be overridden with the \c{no-tests} and \c{unit-tests}
options, respectively. For example:
\
$ bdep new -t lib,unit-tests -l c++ libhello
\
The rationale behind these defaults is that if a functionality can be tested
through the public API, then we should generally prefer integration to unit
testing. And in simple projects the entire functionality is often exposed
through the public API. At the same time, support for unit testing adds extra
complexity to the build infrastructure. Note also that it is fairly
straightforward to add support for unit testing at a later stage. The relevant
build logic is localized in the source subdirectory \c{buildfile} so you can
simply generate a new project with unit tests enabled and copy over the
relevant parts.|
\h#proj-struct-build-out|Build Output|
There are no \c{bin/} or \c{obj/} subdirectories: build output (object files,
libraries, executables, etc) goes into a parallel directory structure (in case
of an out of source build) or next to the sources (in case of an in source
build). See \l{b#intro-dirs-scopes Output Directories and Scopes} for details
on in and out of source builds.
Projects managed with \l{bdep(1)} are always built out-of-source. However, by
default, the source directory is configured as \i{forwarded} to one of the
out-of-source builds. This has two effects: we can run the build system driver
\l{b(1)} directly in the source directory and certain \"interesting\" targets
(such as executables, documentation, test results, etc) will be automatically
\i{backlinked} to the source directory (see \l{b#intro-operations-config
Configuration} for details on forwarded configurations). The following listing
illustrates this setup for our \c{hello} project (executables are marked with
\c{*}):
\
                         hello-gcc/
hello/             ~~>   └── hello/
├── build/         ~~>       ├── build/
└── hello/         ~~>       └── hello/
    ├── hello.cxx                ├── hello.o
    └── hello      -->           └── *hello
\
The result is an \i{as-if} in-source build with all the benefits (such as
having both source and relevant output in the same directory) but without any
of the drawbacks (such as the inability to have multiple builds or a source
directory cluttered with object files).
\N|The often cited motivation for placing executables into \c{bin/} is that in
many build systems it is the only way to make things runnable in a reasonably
cross-platform manner. The major drawback of this arrangement is the need for
unique executable names which is especially constraining when writing tests
where it is convenient to call the executable just \c{driver} or \c{test}.
In \c{build2} there is no such restriction and all executables can run
\i{in-place}. This is achieved with \c{rpath} which is emulated with DLL
assemblies on Windows.|