head 1.9; access; symbols pkgsrc-2026Q1:1.9.0.36 pkgsrc-2026Q1-base:1.9 pkgsrc-2025Q4:1.9.0.34 pkgsrc-2025Q4-base:1.9 pkgsrc-2025Q3:1.9.0.32 pkgsrc-2025Q3-base:1.9 pkgsrc-2025Q2:1.9.0.30 pkgsrc-2025Q2-base:1.9 pkgsrc-2025Q1:1.9.0.28 pkgsrc-2025Q1-base:1.9 pkgsrc-2024Q4:1.9.0.26 pkgsrc-2024Q4-base:1.9 pkgsrc-2024Q3:1.9.0.24 pkgsrc-2024Q3-base:1.9 pkgsrc-2024Q2:1.9.0.22 pkgsrc-2024Q2-base:1.9 pkgsrc-2024Q1:1.9.0.20 pkgsrc-2024Q1-base:1.9 pkgsrc-2023Q4:1.9.0.18 pkgsrc-2023Q4-base:1.9 pkgsrc-2023Q3:1.9.0.16 pkgsrc-2023Q3-base:1.9 pkgsrc-2023Q2:1.9.0.14 pkgsrc-2023Q2-base:1.9 pkgsrc-2023Q1:1.9.0.12 pkgsrc-2023Q1-base:1.9 pkgsrc-2022Q4:1.9.0.10 pkgsrc-2022Q4-base:1.9 pkgsrc-2022Q3:1.9.0.8 pkgsrc-2022Q3-base:1.9 pkgsrc-2022Q2:1.9.0.6 pkgsrc-2022Q2-base:1.9 pkgsrc-2022Q1:1.9.0.4 pkgsrc-2022Q1-base:1.9 pkgsrc-2021Q4:1.9.0.2 pkgsrc-2021Q4-base:1.9 pkgsrc-2021Q3:1.7.0.48 pkgsrc-2021Q3-base:1.7 pkgsrc-2021Q2:1.7.0.46 pkgsrc-2021Q2-base:1.7 pkgsrc-2021Q1:1.7.0.44 pkgsrc-2021Q1-base:1.7 pkgsrc-2020Q4:1.7.0.42 pkgsrc-2020Q4-base:1.7 pkgsrc-2020Q3:1.7.0.40 pkgsrc-2020Q3-base:1.7 pkgsrc-2020Q2:1.7.0.36 pkgsrc-2020Q2-base:1.7 pkgsrc-2020Q1:1.7.0.16 pkgsrc-2020Q1-base:1.7 pkgsrc-2019Q4:1.7.0.38 pkgsrc-2019Q4-base:1.7 pkgsrc-2019Q3:1.7.0.34 pkgsrc-2019Q3-base:1.7 pkgsrc-2019Q2:1.7.0.32 pkgsrc-2019Q2-base:1.7 pkgsrc-2019Q1:1.7.0.30 pkgsrc-2019Q1-base:1.7 pkgsrc-2018Q4:1.7.0.28 pkgsrc-2018Q4-base:1.7 pkgsrc-2018Q3:1.7.0.26 pkgsrc-2018Q3-base:1.7 pkgsrc-2018Q2:1.7.0.24 pkgsrc-2018Q2-base:1.7 pkgsrc-2018Q1:1.7.0.22 pkgsrc-2018Q1-base:1.7 pkgsrc-2017Q4:1.7.0.20 pkgsrc-2017Q4-base:1.7 pkgsrc-2017Q3:1.7.0.18 pkgsrc-2017Q3-base:1.7 pkgsrc-2017Q2:1.7.0.14 pkgsrc-2017Q2-base:1.7 pkgsrc-2017Q1:1.7.0.12 pkgsrc-2017Q1-base:1.7 pkgsrc-2016Q4:1.7.0.10 pkgsrc-2016Q4-base:1.7 pkgsrc-2016Q3:1.7.0.8 pkgsrc-2016Q3-base:1.7 pkgsrc-2016Q2:1.7.0.6 pkgsrc-2016Q2-base:1.7 pkgsrc-2016Q1:1.7.0.4 pkgsrc-2016Q1-base:1.7 pkgsrc-2015Q4:1.7.0.2 pkgsrc-2015Q4-base:1.7 pkgsrc-2015Q3:1.6.0.32 pkgsrc-2015Q3-base:1.6 
pkgsrc-2015Q2:1.6.0.30 pkgsrc-2015Q2-base:1.6 pkgsrc-2015Q1:1.6.0.28 pkgsrc-2015Q1-base:1.6 pkgsrc-2014Q4:1.6.0.26 pkgsrc-2014Q4-base:1.6 pkgsrc-2014Q3:1.6.0.24 pkgsrc-2014Q3-base:1.6 pkgsrc-2014Q2:1.6.0.22 pkgsrc-2014Q2-base:1.6 pkgsrc-2014Q1:1.6.0.20 pkgsrc-2014Q1-base:1.6 pkgsrc-2013Q4:1.6.0.18 pkgsrc-2013Q4-base:1.6 pkgsrc-2013Q3:1.6.0.16 pkgsrc-2013Q3-base:1.6 pkgsrc-2013Q2:1.6.0.14 pkgsrc-2013Q2-base:1.6 pkgsrc-2013Q1:1.6.0.12 pkgsrc-2013Q1-base:1.6 pkgsrc-2012Q4:1.6.0.10 pkgsrc-2012Q4-base:1.6 pkgsrc-2012Q3:1.6.0.8 pkgsrc-2012Q3-base:1.6 pkgsrc-2012Q2:1.6.0.6 pkgsrc-2012Q2-base:1.6 pkgsrc-2012Q1:1.6.0.4 pkgsrc-2012Q1-base:1.6 pkgsrc-2011Q4:1.6.0.2 pkgsrc-2011Q4-base:1.6 pkgsrc-2011Q3:1.5.0.58 pkgsrc-2011Q3-base:1.5 pkgsrc-2011Q2:1.5.0.56 pkgsrc-2011Q2-base:1.5 pkgsrc-2011Q1:1.5.0.54 pkgsrc-2011Q1-base:1.5 pkgsrc-2010Q4:1.5.0.52 pkgsrc-2010Q4-base:1.5 pkgsrc-2010Q3:1.5.0.50 pkgsrc-2010Q3-base:1.5 pkgsrc-2010Q2:1.5.0.48 pkgsrc-2010Q2-base:1.5 pkgsrc-2010Q1:1.5.0.46 pkgsrc-2010Q1-base:1.5 pkgsrc-2009Q4:1.5.0.44 pkgsrc-2009Q4-base:1.5 pkgsrc-2009Q3:1.5.0.42 pkgsrc-2009Q3-base:1.5 pkgsrc-2009Q2:1.5.0.40 pkgsrc-2009Q2-base:1.5 pkgsrc-2009Q1:1.5.0.38 pkgsrc-2009Q1-base:1.5 pkgsrc-2008Q4:1.5.0.36 pkgsrc-2008Q4-base:1.5 pkgsrc-2008Q3:1.5.0.34 pkgsrc-2008Q3-base:1.5 cube-native-xorg:1.5.0.32 cube-native-xorg-base:1.5 pkgsrc-2008Q2:1.5.0.30 pkgsrc-2008Q2-base:1.5 cwrapper:1.5.0.28 pkgsrc-2008Q1:1.5.0.26 pkgsrc-2008Q1-base:1.5 pkgsrc-2007Q4:1.5.0.24 pkgsrc-2007Q4-base:1.5 pkgsrc-2007Q3:1.5.0.22 pkgsrc-2007Q3-base:1.5 pkgsrc-2007Q2:1.5.0.20 pkgsrc-2007Q2-base:1.5 pkgsrc-2007Q1:1.5.0.18 pkgsrc-2007Q1-base:1.5 pkgsrc-2006Q4:1.5.0.16 pkgsrc-2006Q4-base:1.5 pkgsrc-2006Q3:1.5.0.14 pkgsrc-2006Q3-base:1.5 pkgsrc-2006Q2:1.5.0.12 pkgsrc-2006Q2-base:1.5 pkgsrc-2006Q1:1.5.0.10 pkgsrc-2006Q1-base:1.5 pkgsrc-2005Q4:1.5.0.8 pkgsrc-2005Q4-base:1.5 pkgsrc-2005Q3:1.5.0.6 pkgsrc-2005Q3-base:1.5 pkgsrc-2005Q2:1.5.0.4 pkgsrc-2005Q2-base:1.5 pkgsrc-2005Q1:1.5.0.2 pkgsrc-2005Q1-base:1.5 
pkgsrc-2004Q4:1.4.0.10 pkgsrc-2004Q4-base:1.4 pkgsrc-2004Q3:1.4.0.8 pkgsrc-2004Q3-base:1.4 pkgsrc-2004Q2:1.4.0.6 pkgsrc-2004Q2-base:1.4 pkgsrc-2004Q1:1.4.0.4 pkgsrc-2004Q1-base:1.4 pkgsrc-2003Q4:1.4.0.2 pkgsrc-2003Q4-base:1.4 netbsd-1-6-1:1.1.1.1.0.2 netbsd-1-6-1-base:1.1.1.1 pkgsrc-base:1.1.1.1 TNF:1.1.1; locks; strict; comment @# @; 1.9 date 2021.10.26.10.55.42; author nia; state Exp; branches; next 1.8; commitid vzl6zVlmjiF3hjeD; 1.8 date 2021.10.07.14.28.05; author nia; state Exp; branches; next 1.7; commitid wLkpKfebF6VS3TbD; 1.7 date 2015.11.03.23.33.34; author agc; state Exp; branches; next 1.6; commitid vBzTXDWz0lsAFHHy; 1.6 date 2011.11.24.13.53.06; author joerg; state Exp; branches; next 1.5; 1.5 date 2005.02.23.12.06.55; author agc; state Exp; branches; next 1.4; 1.4 date 2003.05.07.15.29.48; author wiz; state Exp; branches; next 1.3; 1.3 date 2003.05.07.13.39.15; author wiz; state Exp; branches; next 1.2; 1.2 date 2003.03.20.11.14.50; author wiz; state Exp; branches; next 1.1; 1.1 date 2003.01.23.19.27.12; author wiz; state Exp; branches 1.1.1.1; next ; 1.1.1.1 date 2003.01.23.19.27.12; author wiz; state Exp; branches; next ; desc @@ 1.9 log @math: Replace RMD160 checksums with BLAKE2s checksums All checksums have been double-checked against existing RMD160 and SHA512 hashes @ text @$NetBSD: distinfo,v 1.8 2021/10/07 14:28:05 nia Exp $ BLAKE2s (graphopt-0.4.1.tgz) = 75d8a9f27ffe67b0ea129afba87e2adec735dbf20b371300e84f71bde88064b4 SHA512 (graphopt-0.4.1.tgz) = 1ec1645777a4e8e8f4bb7b1f485ea878485c1cc37ba3f935676c3e9b5fff39d4a192912a2af33e3672079a003fec90f16cf13a03c3415891f956b99cf20e412a Size (graphopt-0.4.1.tgz) = 77612 bytes SHA1 (patch-src_classes_dotImporter.cc) = 5bdc265514d334cdbedeaad1a6d259d19561cb9e SHA1 (patch-src_classes_nodes.cc) = 4ad2a227be7bdaa248dad7cab3321100c79cc3d3 SHA1 (patch-src_classes_optFileReader.cc) = 00411ffdabc3bc8f5292d561c2c037f96b292c8c SHA1 (patch-src_classes_optFileWriter.cc) = fb7226c0df80b675d8b8e741df67aab367fea90b SHA1 
(patch-src_classes_psExporter.cc) = b9c6caed4a0f2fb887a7b0bce10e8bfacb7088f5 SHA1 (patch-src_classes_visioExporter.cc) = 7c23259e0947ea1d6c7dcc6a18b58bc221981fd3 @ 1.8 log @math: Remove SHA1 hashes for distfiles @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.7 2015/11/03 23:33:34 agc Exp $ d3 1 a3 1 RMD160 (graphopt-0.4.1.tgz) = 78e10751481ed6ae3cf6dc784c210d1a3b73013b @ 1.7 log @Add SHA512 digests for distfiles for math category Problems found locating distfiles: Package dfftpack: missing distfile dfftpack-20001209.tar.gz Package eispack: missing distfile eispack-20001130.tar.gz Package fftpack: missing distfile fftpack-20001130.tar.gz Package linpack: missing distfile linpack-20010510.tar.gz Package minpack: missing distfile minpack-20001130.tar.gz Package odepack: missing distfile odepack-20001130.tar.gz Package py-networkx: missing distfile networkx-1.10.tar.gz Package py-sympy: missing distfile sympy-0.7.6.1.tar.gz Package quadpack: missing distfile quadpack-20001130.tar.gz Otherwise, existing SHA1 digests verified and found to be the same on the machine holding the existing distfiles (morden). All existing SHA1 digests retained for now as an audit trail. @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.6 2011/11/24 13:53:06 joerg Exp $ a2 1 SHA1 (graphopt-0.4.1.tgz) = f9648737604b6d0b403081e59447e39956e6d024 @ 1.6 log @Fix build with modern GCC @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.5 2005/02/23 12:06:55 agc Exp $ d5 1 @ 1.5 log @Add RMD160 digests in addition to SHA1 ones. @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.4 2003/05/07 15:29:48 wiz Exp $ d6 6 @ 1.4 log @Update to 0.4.1: * Changed pixmap directory to $prefix/share/pixmaps/graphopt @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.3 2003/05/07 13:39:15 wiz Exp $ d4 1 @ 1.3 log @Update to 0.4. 
* Added postscript export * Made the pixmaps install correctly @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.2 2003/03/20 11:14:50 wiz Exp $ d3 2 a4 2 SHA1 (graphopt-0.4.tgz) = b4caee1821e8b05372be892958952d7286378238 Size (graphopt-0.4.tgz) = 77591 bytes @ 1.2 log @Update to 0.3. v0.3 * Changed default spring length to 0 and default spring constant to 1 * Made dot file import more robust: * Reports if it doesn't seem to be a dot file rather than blindly trying to open it * Accounts for more dot syntax * doesn't segfault on any of the graphviz examples * Made graphopt file opening more robust: * Reports if it doesn't seem to be a graphopt file rather than blindly trying to open it v0.2 * Changed references to vector to std::vector, making compliant with latest c++ standards (a la gcc 3) @ text @d1 1 a1 1 $NetBSD: distinfo,v 1.1.1.1 2003/01/23 19:27:12 wiz Exp $ d3 2 a4 2 SHA1 (graphopt-0.3.tgz) = 845173a3397e0164b092f6024b3094a07fdf5f87 Size (graphopt-0.3.tgz) = 76323 bytes @ 1.1 log @Initial revision @ text @d1 1 a1 1 $NetBSD$ d3 2 a4 2 SHA1 (graphopt-0.1.tgz) = 8d75ba3edf34c01474201aee8586af9a4f243d5c Size (graphopt-0.1.tgz) = 75083 bytes @ 1.1.1.1 log @Initial import of graphopt-0.1, a graph layout optimizer: In contrast to Graphviz and other graph optimizers, graphopt does not use a heuristic approach to layout optimization. Instead, it uses basic principles of physics to iteratively determine optimal layout. Each node is given mass and an electric charge, and each edge is represented as a spring. Node mass, electric charge, optimal spring length, and the spring constant are tweakable in the gui in realtime. For most graphs, this is all that is needed - hit 'go' and the graph organizes itself much as the analogous real-life system would if constrained to two dimensions. For more complex graphs, some fiddling with the physical parameters at different stages of optimization usually does the trick. 
To accommodate very large graphs, an additional mechanism called layering was added. When a graph is loaded, nodes are assigned to layers based on their relative positions. During optimization, you can choose to hide any number of layers. Any nodes assigned to a layer lower than the selected layer are not only hidden, but neither their electric charges nor the forces of the springs attached to them are figured into the forces acting on the visible nodes. In effect, those nodes cease to exist, and a smaller graph is allowed to lay itself out without being constrained by an excessive number of nodes. @ text @@