# [[User:Maltfield]]
# [[Special:Contributions/Maltfield]]

=Sat Jun 02, 2018=
# today I helped Marcin build the D3D printer in a workshop at a homestead near Lawrence, Kansas. I made a few notes about what went wrong & how we can improve this machine for future iterations
## The bolts for the carriage & end pieces (the 3d printed halves that get sandwiched together by bolts/nuts to clamp the long metal axis rods) are inaccessible under the motor. So if there's an issue, you have to take off the motor's 4 small bolts just to access the end piece's 4 bolts. This happened to me when I used the same-sized bolts in all 4x holes; in fact, one or two of the bolts should go through the metal frame of the cube, so I had to remove the motor to replace that bolt. This also had to be done on a few of the other 5 axes as well, slowing down the build. It would be great if we could alter this piece by moving the bolts further from the motor, making the pieces larger if necessary.
## One new addition to this design was the LCD screen, plus a piece of plexiglass to which we zip-tied all our electronics before mounting the plexiglass to the frame via magnets. The issue: when we arranged the electronics on the plexiglass, we were mostly concerned about wire location & lengths. We did not consider that the LCD module (which has an SD card slot on its left side) would need to be accessible. Instead, it was butted against the heated bed relay, making the SD card slot inaccessible. In future builds, we should ensure that this component has nothing to its left side, so that the SD card slot remains accessible.
# we used my phone + Open Camera to capture a time-lapse of the build

=Fri Jun 01, 2018=
# spent another couple hours installing tree protectors for the hazelnut trees. This is my last day, and the total count of protectors I've installed is 66. I spent a total of ~10 hours putting these 66 protectors up.
# I fixed a mod_security false-positive
## 950911 generic attack, http response splitting
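## for future reference, a minimal sketch of how this kind of false-positive is typically whitelisted (the rule ID is from this incident; the config file path is an assumption -- adjust to wherever mod_security is configured on the server):
<pre>
# disable only the rule that false-positives, rather than disabling mod_security wholesale
echo 'SecRuleRemoveById 950911' >> /etc/httpd/conf.d/mod_security_whitelist.conf

# sanity-check the config & reload apache gracefully
apachectl configtest && apachectl graceful
</pre>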
# spent some time researching steel wire wrapping, to consider whether it could be used in a plastic, 3d printed composite http://www.appropedia.org/Open_Source_Multi-Head_3D_Printer_for_Polymer-Metal_Composite_Component_Manufacturing
# spent some time researching carbon fiber, to consider whether it could be used in a plastic, 3d printed composite
# spent some time researching bamboo fiber, to consider whether it could be used in a plastic, 3d printed composite
## further research is necessary on methods for mechanical retting of bamboo without chemicals

=Thu May 31, 2018=
# spent another couple hours installing tree protectors for the hazelnut trees after Marcin gave me more wire from the workshop yesterday. The current count of protected hazelnut trees is 54.
## The field is quite large, but I estimate that I'm 25-40% through the keylines. That means, as of 2018-05 (or about 1 year after they were planted), there's approximately a few hundred hazelnut trees planted along keylines at FeF. If those survived last winter (and they can get proper protection from rabbits), then they'll probably stick around. Even though the numbers are far less than the numbers planted, it's hard to complain about having _only_ a few hundred hazelnut trees!
# I spent a couple hours in the shop attempting to build a square side (ie: for our D3D printer) out of cut metal strips joined with quick-set epoxy. Logs
# I finished assembling one of the extruders!
## some of these m3 bolts are 18 mm long, some are 25 mm, one is 20 mm. It would be great if we could make them all 20 mm.
## some of these bolts are supposed to just go straight into the plastic. That doesn't work well; the bolts don't go in & usually just fall out
## the spots where we're supposed to insert a nut into a hole & push it to the back don't work so well either. In contrast, the nut catches (where you slide the nut in sideways, rather than pushing it back into the hole) work *great*. If possible, we should use a nut catch for all bolts, or just have a washer/nut outside the structure entirely, if that's possible.

=Wed May 30, 2018=
# spent some time installing tree protectors for the hazelnut trees before running out of wire
# spent some time researching github's file size limits. There are no hard limits, but they ask that repos be kept under 1G. Individual files are capped at 100M, which is great. They also have another service called "Git LFS" = Large File Storage. LFS is for files 100M-2G in size, and it stores versions too. This is provided for free up to 1G (so the 2G limit isn't free!) https://blog.github.com/2015-04-08-announcing-git-large-file-storage-lfs/
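## a minimal sketch of the LFS workflow, in case we adopt it (the *.fcstd pattern & file name are just illustrative assumptions):
<pre>
git lfs install                 # one-time: enable the lfs filters for this user
git lfs track "*.fcstd"         # route a large binary file type through lfs
git add .gitattributes          # 'track' records its patterns in this file
git add Prusa_i3_mk2_extruder_adapted.fcstd
git commit -m 'store large cad file via lfs'
git push                        # the big file goes to lfs storage, not the repo itself
</pre>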
# The git page pointed me to a backup solution whose website had a useful comparison of costs for their competitors' services https://www.arqbackup.com/features/
# Right now I'm estimating that we're paying ~$100/year for a combination of s3 + glacier storage of ~1 TB. We pay for what we use, but the process of using it is so damn complicated (and slow, for Glacier!) that if we could pay <$100 for 1T elsewhere, it's worth considering
## Microsoft OneDrive is listed as $7/mo for 1T
## Backblaze B2 is listed as $5/mo for 1T
### this is probably the cheapest option, and is worth noting for future reference
## Google Coldline is listed as $7/mo for 1T
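## annualizing those list prices against our current ~$100/year:
<pre>
# monthly price * 12 months = annual cost for ~1T
OneDrive:  $7/mo * 12 = $84/yr
B2:        $5/mo * 12 = $60/yr
Coldline:  $7/mo * 12 = $84/yr
</pre>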
# spent some time documenting our server's resource usage, now that I have the data to determine what we actually need following the wiki migration. The result: our server is heavily overprovisioned. We could get two hetzner cloud nodes (one for prod, one for dev) and still save 100 EUR/year. https://wiki.opensourceecology.org/wiki/OSE_Server#Looking_Forward
# spent some time curating our wiki IT info
# fixed a mod_security false-positive
## 981320 SQLI
# imported Marcin's new gpg key into the ossec keyring
<pre>
mkdir -p /var/tmp/gpg
pushd /var/tmp/gpg
# write multi-line to file for documentation copy & paste
cat << EOF > /var/tmp/gpg/marcin.pubkey2.asc
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: GnuPG v1

mQINBFsDbMMBEACypyMZ/J9+M1DvNd+EGhIpRXEKH5WldOXlZtJAh1tGH5cvqBwR
QDCCyVAA+WsiE0IQJByrpxPbj25ypPSMcyhJYmmDOa/0R/NdVuBgJNmWFSyfB/aU
dKAC3brLMC8zUffieug0bVE6vI8QE/DUAGKU5AyNFOD3itFGgI7HtlaknU9ql7um
VxrOM7VU/GmqZcg5hqno6r1mhiG9boitM10lSav+Hylv3Es01pLUvy/NlJEZ10lZ
rQ8RHIQSTpxj9C9L32DjvcJ8BfIHzr6aY/xv5tbPDJuLgsPgn6EoUZkNQAyPMV8J
8MT26UmwlA0WvMkHJze+kgsXD5FUk7MuZM5ttEHKsngN5Sim1M+dBnUtg6QG4zpf
KhyVOOpag1L3iyCwGMbRIX8cTk2Hk39Csf37QKDUrHMbDqAOcQzpr6YcbEO/PPXW
u2VQDJfuiWrgQI7v+ac8uAlH66c6MmEqtsduxVmUYK1C7LlDmcswa4kOP/5WkpJ8
kFwicIM/qpZgewpjtD+ATADs0knA+D+MBQSoMI6FhCLLytz2JpIEtHJFDvDuV/7Q
Yi+RDyFqNr+i7rkNe/xpb5lzrLutN7JEYeMn+LsPH6Ucd8mGJ7j88c0OZUidkOu5
KErG4xHqee87B+Et0/LfEABogDAPnqH027tCMXHu8g2Ih8kZnglEnNeP6wARAQAB
tDBNYXJjaW4gSmFrdWJvd3NraSA8bWFyY2luQG9wZW5zb3VyY2VlY29sb2d5Lm9y
Zz6JAj0EEwEIACcFAlsDbMMCGwMFCRLMAwAFCwkIBwIGFQgJCgsCBBYCAwECHgEC
F4AACgkQ186EWruNpsEEGg//f195qc3hJcyon9Rq+tH7yp8hJN+Pcy3WBnj0Amvg
fPYGR1W5qbCnd9NPdcAz8J1H1Hsbz9+zYDlhIp71iuTlNvtT821du9bLwqplN9UI
YNkRAYm/kwd2qAYNPdVKW0lY9OhvyZA5XrjyQQxVtzQmuB0kTrzX1Br6ZWnMNavd
X2yhfbxJY71HbETMw/VLBubbl8RwpZGzXqye23Il8SryicDk9oIXF6uExB4Ym7PJ
3+h4Sn9dvAQOEsjl57r/ZHNctb4VLqJfVo12ba2XxTx0TGrdYGbiONHu9P2u+Zwj
+NlGmKq2+h2V4pdfl5buj90NtdV3GjJ6wBiBZ0sAO0tKIAp1PWP+Ayo9ep2G8H4A
R8WMJZ6VaXw54C2gLlyzwrsZztqBWljfL8tHtCyOKjN1YJuucn2pzEz/ENTOC3cn
SNzBTXSi/fJBaBgbueMtDE1j0VWjfcm+zIkMfcjUoN+w7gQGEQGc/myvDnEIevcy
ITlejx2MnCj3cjkKrOUXct+3pJwWuxFFfWtOUF91cgAd+FrVw7kQSNfS7T4X7jVO
frVpAXthQaSJIDas5ZqnBlkCdkF+4Oj8IbpV0RUHNIOy0XXJqb6Z3YVUjQdT+Dup
4wmz6dlNdNWfP0iyo6OOuphz+Tz9ZkPDfLXznR+tz62PB/oeHxE0S/zWDXTeyqWp
RyWJAhwEEwEKAAYFAlsDgiEACgkQ/huESU5kDUH+1g//ZoS0E9R6pKfvVBTnuphW
gmCuAgGXAxdMioCYYNqn+jGyy6XdDKcsVATJT0pwctMhkAxEajafzaoBC1/pellh
vO3c7088/BMzRJYSTHeAANd2qctK00ZZZ149T41TedfGaYOEJSNWyjXAZeOM8dlb
qLRkFVf2Zo4rG6ij55ywLS7Cqv1TBMwWzx70gl0TnPxBhBj48Mr/JnhYRQVZtm5c
MiaTncwGJky2CCEXTJqYGT9wDe74w1GGZz5Png59rs6m/1mibdtQ1YbF9gX5pBoK
afpPVRLSISNKyB8PUVNf/2Uqckl1JQ95rcsgTqArcLWeBV4fIm18SfKglYRg2I2u
EP4Fz3oLHROQ6aTPzQgfRX7ZFI7w7lEwOSwQTgC0qjH+y+5a7/H/+wuXtfnuHBsu
nJikH2MzmccRdUQGNtZLJ5HBVpglV3OAMWbknmGOSWdPPaeD68hhOJlfaq51HA8/
ewav9VDPADL/GBy9zSadWRYLCbmkPaksvYdP0exndeLr/GMNsO/jsI/BBgbtG5EM
qc71SEJDjOe+T1/NuoPQiQwaHXgUNgB6/F33sByKPu56M2T+gctpQHg2dw6U7LAK
biE8Q3pCoIzz+2/AZd/+vpdzZ71qahBiOMmrGTJfkqWDar8DP+bXHLYDZBYpExPg
MB+w06S7CsNzrmhBiuysm++JAhwEEwEKAAYFAlsDgj4ACgkQqj7fcWDi2XvKphAA
j/H/atXb2fyN/VJ3tPQ0qsmv3ctDpMnazCwRksTZHzFhZdyi6mu8zlE+iK9SGr5L
PTc+jSK02JnuAQcnZHMNrov6wPPAaoRFDQ7Nv9LUmzVJPnxXuoFxF1akkr0cdxpZ
4nfcCIZS0i43RLWSKuFFz81Oy4Med8U9JXq/NxYw/a5D7PZ7flSSUDYrQwgOQtut
lCebOPb/iu6A87HJ+bhtQb7G7G68HkFmlATnjA0AmeM/+PQ8AR6YH5mbgQeWmPTq
XJdfBs5+AFyUw1zJPa5GPBa+96tqCjOrkxrwR/FCe1L2Q+BfkBRDDg2FA6/pekG4
kzAB++JH3Uai6PSgmifUDMsA++4oRGf7ALqoXnXwu4SOQ2vlrsPjAnV77us5JvdI
Wc346uzvcJAyFOmBuQqRKOOsgYpEj1Q5HKkDuZNLM8e89o0dTOwcm4e8BR00GN6+
OyC6D8U8T72kFv3WvW5HqiP5mmGZDBNWLaXFjLJBSUrFVw9OJWuisSbX6JoISE4Y
RFhzS/REKLn7LDvVvByI3wZF6GLbfKkdzZHoK0Fc4GFiVloDOC7iGiHV+cw2Ivwx
yhsdRciuH5yRnbNhekaNNFddcmq2K6QPLgbDIBX43eFmArRk/mLwyMyvhVQT1NuL
NqudMTihZeO10A4evHqHDmiYIi0cRf9OKct0S7bSwJm5Ag0EWwNswwEQAMcuLBNf
/iTsBnvrI7cD2S24pVGMowaPDWMD1PEfwdL7dHDA4hTnrJexXHxGTFLiKgwhTdCr
ZnBUNmL1CjoN2nO02MlFPcDNsPAa03KSF/IIpx1v/Y7yYN3eJX1nthQ3rPJnguEe
L7mgBYtGeKBBdTWGzfHYDYI8IaUP6Bhfc6Yj+a5NVh+NsObhX0IMoa/lQNLDlfav
tqdDgi7tMuf/Qyz1VvgpYYzXDq9KdipWssCHEDnIggdlJGemQyQMGuAil1TOC+S8
9D/IbOuo3Wa+YMIu7g6cX0jX8Lp0kBH6yNlmIXvvOzV8smOVwemTl8Lt/9hETJqx
aXL9j3DCoYVA87MAGcBD3EMFjQKwVLIWe84B8i5G44yD2DCHBNL/Qeq09klI5T5M
BAgYbNoKx130pf0jGD6dzdfDiMgclAuhz5VTkNh5RCu7rdVgHGQKm5f6sVXCuAfl
/f3Wv66lyCIHbb+LAxnG07bPHLGgHtrS+xRp7d+y7ezaTSmzcOs8lb1C6D/tJXyV
+64lgkTsLid3ljVsMMCRWdRyXYWMOPAt9krFIW6niYHokN5m5uB/l/Vad+PYJ8WA
Agpord+A2vSLliogO1BiDX5lcZmlFPSDDAlr5373KGoBSoYIXq6xcqsvkg3F4RCW
B5YEWgBiX9roXzZ7oMUUK7uhDixFMqAWmN+dABEBAAGJAiUEGAEIAA8FAlsDbMMC
GwwFCRLMAwAACgkQ186EWruNpsEHSw//dXXtuO5V6M+OZ5ArMj1vFudU57PNT+35
5prq6IIDCeRiTanpjIR3GuOGtK3D+4r6Nk1lCoG0CwFPUu7k51gsdkB9DRrRYKX5
fXkl8UC+e8dKo9bMS3jyY9nC7Mv1DPc4gx7VoZeXsxlqz60tEG3HWehLGt03z47C
5I9VVLkTvxt73VH9BHcZaScyPfn3kOlbBSW6U/6ZnRJQ6pc6xPxMsqo0OznYgU9k
YpkS6xwjqT7MYCw4DiW5kSIqNBRMl3suLUUvsJH4OOjilIt4Su+GxftrokmayRYr
XRP0k/Tnf7nrjPl7znbCFxEEVSezaQE2rxQCiKXkmvYzaPjJXZmPgz49oih24Tgn
Llk70qRoRXt2MkZG3TH/t755ORYl5BUeyhnPSzOD/1BiFJze7N+r5mGtJsdjBSyO
LEdjVzsLRhKvheDkrsbguiV8wjaHdfpdPUdYHnWs/HZ7e9HyGoGxaYPRzYosqTu5
pxgIs4c3Toy7nYQjINd/IhLCYL7UBT+ybNMzh15u63UYun37x4mbdkkx7TzZpXex
cnP2bJijq/TJD8PRJNY9GFd5fnluk6xpaFH1YAtQbe/YpTHP0xn45Hi91tsv7S7F
Tl5+BGflBcIQOF80tOHetUrtH3cjp/dtKCE5ZU5Vt9pxlvQeO+azOH1jXQ35vs2t
7VMKgjAEf/c=
=nvDm
-----END PGP PUBLIC KEY BLOCK-----
EOF
gpg --homedir /var/ossec/.gnupg --delete-key marcin
gpg --homedir /var/ossec/.gnupg --import /var/tmp/gpg/marcin.pubkey2.asc
popd
</pre>
# confirmed that the right key was there
<pre>
[root@hetzner2 gpg]# gpg --homedir /var/ossec/.gnupg --list-keys
gpg: WARNING: unsafe ownership on homedir `/var/ossec/.gnupg'
/var/ossec/.gnupg/pubring.gpg
-----------------------------
pub   4096R/60E2D97B 2017-06-20 [expires: 2027-06-18]
uid                  Michael Altfield <michael@opensourceecology.org>
sub   4096R/9FAD6BEF 2017-06-20 [expires: 2027-06-18]

pub   4096R/4E640D41 2017-09-30 [expires: 2018-10-01]
uid                  Michael Altfield <michael@michaelaltfield.net>
uid                  Michael Altfield <vt6t5up@mail.ru>
sub   4096R/745DD5CF 2017-09-30 [expires: 2018-09-30]

pub   4096R/BB8DA6C1 2018-05-22 [expires: 2028-05-19]
uid                  Marcin Jakubowski <marcin@opensourceecology.org>
sub   4096R/36939DE8 2018-05-22 [expires: 2028-05-19]
</pre>
# documented this on a new page named Ossec https://wiki.opensourceecology.org/wiki/Ossec
# I began building the Prusa i3 mk2 extruder assembly
## I have never done this before, but I do have the freecad file https://wiki.opensourceecology.org/wiki/File:Prusa_i3_mk2_extruder_adapted.fcstd
## I uploaded the 3x 3d printables that I exported from the above freecad file. I had previously exported these stl files so I could import them into Cura & print them on the Jellybox. Now that I have 2x of each of the 3x pieces, I can begin the build with the hardware (springs, extruder, fans, bolts, nuts, washers, etc) that Marcin gave me (he ordered it from McMaster-Carr)
### idler https://wiki.opensourceecology.org/wiki/File:Prusa_i3_mk2_extruder_adapted_idler.stl
### cover https://wiki.opensourceecology.org/wiki/File:Prusa_i3_mk2_extruder_adapted_cover.stl
### body https://wiki.opensourceecology.org/wiki/File:Prusa_i3_mk2_extruder_adapted_body.stl
## there appears to be some piece (a shaft?) labeled "5x16SH" in the body at the shaft bearing that I don't have
## I may have installed the fan backwards; it's hard to tell which way this thing will blow, but the cad shows it should blow in, towards the heatsink
## I also appear to be missing a few more pieces:
### the "fan nozzle" for the print fan
### the print fan
### the interface
### the motor
### the proximity sensor
## I also had some issues inserting a nut into the following holes
### NUT_THIN_M012 into the back of the body, which receives the SCHS_M3_25 from the cover
## I extracted stl files for the fan nozzle and a small cylinder for the shaft of the bearing. These have been uploaded to the wiki as well https://wiki.opensourceecology.org/wiki/D3D_Extruder#CAM_.283D_Print_Files_of_Modified_Prisa_i3_MK2.29
## The interface needed to be 3d printed too, but it totally lacked holes. They were sketched, but they didn't go through the "interface plate". I spent a few hours in freecad trying to sketch holes on the face of the plate & push them through, but the pocket only went partially through the interface plate (I made the length 999mm, tried "reversed" & "full length"; nothing helped). Marcin took the file, copied the interface plate
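## for next time, a boolean cut from freecad's python console can punch a hole straight through a plate regardless of the sketch's pocket settings; a minimal sketch, where the object name, coordinates, & M3 clearance size are all illustrative assumptions:
<pre>
# run in the freecad python console with the model open
import FreeCAD, Part

doc = FreeCAD.ActiveDocument
plate = doc.getObject("interface_plate")   # object name is an assumption

# a 3.2mm-diameter cylinder, tall enough to pass fully through the plate
hole = Part.makeCylinder(1.6, 100, FreeCAD.Vector(10, 10, -50))  # x,y are illustrative

# boolean-subtract the cylinder from the plate & show the result as a new object
Part.show(plate.Shape.cut(hole))
doc.recompute()
</pre>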

=Tue May 29, 2018=
Processing POST data (application/json) (0 bytes)...
[transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
{"janus":"trickle","candidate":{"candidate":"candidate:201398067 1 udp 2122260223 10.137.2.17 46853 typ host generation 0 ufrag MNDb network-id 1 network-cost 50","sdpMid":"data","sdpMLineIndex":0},"transaction":"JpDJKwdL9Rj4"}
Forwarding request to the core (0x7fa30c014790)
Got a Janus API request from janus.transport.http (0x7fa30c014790)
[45723605327998] Trickle candidate (data): candidate:201398067 1 udp 2122260223 10.137.2.17 46853 typ host generation 0 ufrag MNDb network-id 1 network-cost 50
[45723605327998] Adding remote candidate component:1 stream:1 type:host 10.137.2.17:46853
[45723605327998] Candidate added to the list! (1 elements for 1/1)
[45723605327998] ICE already started for this component, setting candidates we have up to now
[45723605327998] ## Setting remote candidates: stream 1, component 1 (1 in the list)
[45723605327998] >> Remote Stream #1, Component #1
[45723605327998]    Address:    10.137.2.17:46853
[45723605327998]    Priority:   2122260223
[45723605327998]    Foundation: 201398067
[45723605327998]    Username:   MNDb
[45723605327998]    Password:   8F39sum8obXhdVgCLhNhUVLo
[45723605327998] Setting remote credentials...
[45723605327998] Component state changed for component 1 in stream 1: 2 (connecting)
[45723605327998] Discovered new remote candidate for component 1 in stream 1: foundation=1
[45723605327998] Stream #1, Component #1
[45723605327998]    Address:    66.18.33.130:41785
[45723605327998]    Priority:   1853824767
[45723605327998]    Foundation: 1
[45723605327998] Remote candidates set!
Sending Janus API response to janus.transport.http (0x7fa30c014790)
Got a Janus API response to send (0x7fa30c014790)
New connection on REST API: ::ffff:66.18.33.130
[transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
[transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
[transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
[transports/janus_http.c:janus_http_headers:1690] Content-Length: 79
[transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
[transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
[transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
[transports/janus_http.c:janus_http_headers:1690] content-type: application/json
[transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
[transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
[transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
[transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1223] ... parsing request...
Session: 4989268396723854
Handle: 45723605327998
Processing POST data (application/json) (79 bytes)...
[transports/janus_http.c:janus_http_handler:1248] -- Data we have now (79 bytes)
[transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1223] ... parsing request...
Session: 4989268396723854
Handle: 45723605327998
Processing POST data (application/json) (0 bytes)...
[transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
{"janus":"trickle","candidate":{"completed":true},"transaction":"xVqDncVePyih"}
Forwarding request to the core (0x7fa30c014790)
Got a Janus API request from janus.transport.http (0x7fa30c014790)
[transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
[transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
[transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
[transports/janus_http.c:janus_http_headers:1690] Content-Length: 260
[transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
[transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
[transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
[transports/janus_http.c:janus_http_headers:1690] content-type: application/json
[transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
[transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
[transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
[transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1223] ... parsing request...
Session: 4989268396723854
Handle: 45723605327998
Processing POST data (application/json) (260 bytes)...
[transports/janus_http.c:janus_http_handler:1248] -- Data we have now (260 bytes)
[transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/4989268396723854/45723605327998...
[transports/janus_http.c:janus_http_handler:1223] ... parsing request...
Session: 4989268396723854
Handle: 45723605327998
Processing POST data (application/json) (0 bytes)...
[transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
{"janus":"trickle","candidate":{"candidate":"candidate:2774440166 1 udp 1686052607 66.18.33.130 41785 typ srflx raddr 10.137.2.17 rport 46853 generation 0 ufrag MNDb network-id 1 network-cost 50","sdpMid":"data","sdpMLineIndex":0},"transaction":"RByDZe9bnARf"}
Forwarding request to the core (0x7fa31c003940)
Got a Janus API request from janus.transport.http (0x7fa31c003940)
No more remote candidates for handle 45723605327998!
Sending Janus API response to janus.transport.http (0x7fa30c014790)
Got a Janus API response to send (0x7fa30c014790)
[45723605327998] Trickle candidate (data): candidate:2774440166 1 udp 1686052607 66.18.33.130 41785 typ srflx raddr 10.137.2.17 rport 46853 generation 0 ufrag MNDb network-id 1 network-cost 50
[45723605327998] Adding remote candidate component:1 stream:1 type:srflx 10.137.2.17:46853 --> 66.18.33.130:41785
[45723605327998] Candidate added to the list! (2 elements for 1/1)
[45723605327998] Trickle candidate added!
Sending Janus API response to janus.transport.http (0x7fa31c003940)
Got a Janus API response to send (0x7fa31c003940)
[45723605327998] Looks like DTLS!
[45723605327998] Component state changed for component 1 in stream 1: 3 (connected)
[45723605327998] ICE send thread started...; 0x7fa2fc015190
[45723605327998] Looks like DTLS!
New connection on REST API: ::ffff:66.18.33.130
[45723605327998] New selected pair for component 1 in stream 1: 1 <-> 2774440166
[45723605327998] Component is ready enough, starting DTLS handshake...
janus_dtls_bio_filter_ctrl: 50
janus_dtls_bio_filter_ctrl: 6
janus_dtls_bio_filter_ctrl: 50
[45723605327998] Creating retransmission timer with ID 4
[transports/janus_http.c:janus_http_handler:1137] Got a HTTP GET request on /janus/4989268396723854...
[transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
[transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
[transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
[transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
[transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
[transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
[transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
[transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
[transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
[transports/janus_http.c:janus_http_handler:1170] Processing HTTP GET request on /janus/4989268396723854...
[transports/janus_http.c:janus_http_handler:1223] ... parsing request...
Session: 4989268396723854
Got a Janus API request from janus.transport.http (0x7fa30c014790)
Session 4989268396723854 found... returning up to 1 messages
Got a keep-alive on session 4989268396723854
Sending Janus API response to janus.transport.http (0x7fa30c014790)
Got a Janus API response to send (0x7fa30c014790)
New connection on REST API: ::ffff:66.18.33.130
[45723605327998] Looks like DTLS!
[45723605327998] Written 156 bytes on the read BIO...
janus_dtls_bio_filter_ctrl: 50
janus_dtls_bio_filter_ctrl: 49
Advertizing MTU: 1200
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_write: 0x7fa31c051e00, 1107
  -- 1107
New list length: 1
janus_dtls_bio_filter_ctrl: 50
[45723605327998]     ... and read -1 of them from SSL...
[45723605327998]   >> Going to send DTLS data: 1107 bytes
[45723605327998]   >> >> Read 1107 bytes from the write_BIO...
[45723605327998]   >> >> ... and sent 1107 of those bytes on the socket
[45723605327998] Initialization not finished yet...
[45723605327998] DTLSv1_get_timeout: 968
[45723605327998] DTLSv1_get_timeout: 918
[45723605327998] Looks like DTLS!
[45723605327998] Written 591 bytes on the read BIO...
janus_dtls_bio_filter_ctrl: 50
janus_dtls_bio_filter_ctrl: 51
janus_dtls_bio_filter_ctrl: 53
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 52
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_ctrl: 49
janus_dtls_bio_filter_write: 0x7fa31c051e00, 570
  -- 570
New list length: 1
janus_dtls_bio_filter_ctrl: 7
janus_dtls_bio_filter_ctrl: 50
[45723605327998]     ... and read -1 of them from SSL...
[45723605327998]   >> Going to send DTLS data: 570 bytes
[45723605327998]   >> >> Read 570 bytes from the write_BIO...
[45723605327998]   >> >> ... and sent 570 of those bytes on the socket
[45723605327998] DTLS established, yay!
[45723605327998] Computing sha-256 fingerprint of remote certificate...
[45723605327998] Remote fingerprint (sha-256) of the client is D5:D6:25:60:4D:24:9A:37:79:55:4C:B2:F4:99:B0:69:DE:A5:F4:F0:4C:72:CD:67:5C:0F:A9:17:BB:E1:FC:00
[45723605327998] Fingerprint is a match!
Segmentation fault (core dumped)
[root@ip-172-31-28-115 janus-gateway]#
</pre>
# I tried this again in firefox, and the text room fully loaded!
# I tried this in chromium, and it segfaulted again :(
# anyway, I tried this in 2x distinct firefox windows, and they could read each other's text messages.
# I tested jangouts, and text works there now too!
# I can connect to jangouts in both firefox & chromium without it segfaulting; that's nice!
# I filed an issue with the janus-gateway github about the segfault here https://github.com/meetecho/janus-gateway/issues/1233
# holy crap, I got a response in less than 5 minutes! They wanted a gdb stacktrace, which I provided
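## a sketch of how such a stacktrace is typically produced (the binary path assumes our --prefix=/opt/janus build; the core file name varies per crash):
<pre>
ulimit -c unlimited            # allow core dumps before reproducing the crash
gdb /opt/janus/bin/janus core  # open the dump against the janus binary
(gdb) bt full                  # print the full backtrace for the issue report
</pre>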
# It was also pointed out that using an Address Sanitizer would be helpful, per their documentation. I attempted to install this, but got an error https://janus.conf.meetecho.com/docs/debug
<pre>
yum --enablerepo=* install -y libasan
[root@ip-172-31-28-115 janus-gateway]# CFLAGS="-fsanitize=address -fno-omit-frame-pointer" LDFLAGS="-lasan" ./configure --prefix="/opt/janus" --enable-data-channels
checking for a BSD-compatible install... /bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
checking whether make supports nested variables... yes
checking whether make supports nested variables... (cached) yes
checking for style of include used by make... GNU
checking for gcc... gcc
checking whether the C compiler works... no
configure: error: in `/root/sandbox/janus-gateway':
configure: error: C compiler cannot create executables
See `config.log' for more details
[root@ip-172-31-28-115 janus-gateway]#
</pre>
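## (when configure dies like this, the actual compiler error is near the bottom of config.log; a quick way to see it:)
<pre>
tail -n 40 config.log
</pre>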
# when I tried using 'libasan-static' it worked *shrug*
<pre>
yum --enablerepo=* install -y libasan
CFLAGS="-fsanitize=address -fno-omit-frame-pointer" LDFLAGS="-lasan" ./configure --prefix="/opt/janus" --enable-data-channels
make clean
make
make install
</pre>
# great news! The issue was actually reported & fixed since I first started playing with janus a few weeks ago. I did a git pull & recompiled, and the segfaults stopped. I found this after a comment back-and-forth with a developer on my issue within an hour of posting it. This is an amazingly active project! https://github.com/meetecho/janus-gateway/issues/1223
# Unfortunately, though the segfault is fixed, the text room still won't load in Chromium.
# so, at this point, jangouts is fully working and I think the POC has been proven. Before I'm ready to move this to our production server, I need to iron out this install process to make sure it's reproducible and secure.
## reproducibility is just a matter of terminating the ec2 instance, following my documented commands, and ending up with the same result
## security is a bit more work. We've gone to enormous lengths to ensure that most of our server's daemons are not internet-facing unless they must be, and that what is (nginx) and the background daemons it services (httpd using php) are as locked-down as possible. Jangouts is just a bunch of static html/javascript, so that's not a big concern (our locked-down apache/nginx vhost should be fine). But Janus has a public-facing REST API, and public-facing ICE for STUN/TURN. If, for example, any of these components has a coding error that leads to a buffer overflow that leads to remote code execution, it could undermine all of our efforts in securing the other applications on our production server. Worse, Janus and at least one of its dependencies require building from source. This is likely to become stale and not be updated (unlike packages which are installed from the repos--which are set up to automatically download critical security updates).
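## a quick first pass at enumerating that attack surface is to list the sockets janus actually binds (run on the poc box; flags = tcp+udp, listening, numeric, show process):
<pre>
ss -tlnpu | grep janus
</pre>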
# I need to spend some time investigating Janus and ICE to see how to harden them as much as possible
# first, I went back to basics: Google worked on WebRTC, and here's one of their presentations from 2013 https://www.youtube.com/watch?v=p2HzZkd2A40&t=21m12s
# I learned that ICE is a framework for NAT traversal that utilizes both STUN _and_ TURN. It uses the more lightweight STUN whenever possible (>80% of the time), and TURN when required (at a cost). Also, every TURN server supports STUN; TURN is just STUN with relaying added in. The relaying taxes bandwidth considerably at scale, whereas STUN scales well. https://www.html5rocks.com/en/tutorials/webrtc/infrastructure/
# I discovered a couple of interesting techs that use webrtc
## PeerCDN was supposed to be a p2p CDN, but the site appears unresponsive. Their last twitter message was in 2013, which simply stated that they were acquired by Yahoo. And then, silence. https://twitter.com/peercdn
## togetherJS is like an ephemeral etherpad that uses RTC for collaboration https://togetherjs.com/docs/#technology-overview
# this is a great explanation of the signaling used for WebRTC https://www.html5rocks.com/en/tutorials/webrtc/infrastructure/
## The more I read, the more I think that our bottleneck on Jitsi Meet is because it's an SFU instead of a dedicated MCU. The article above mentions a few open source MCUs: Licode and OpenTok's Mantis

=Fri May 11, 2018=
# updated our backup script (/root/backups/backup.sh) on hetzner2 to encrypt backups before shipping them off to dreamhost
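## a sketch of the encrypt-then-ship step that was added (the recipient, archive name, & remote path here are illustrative assumptions; the real logic lives in /root/backups/backup.sh):
<pre>
# encrypt the archive to our pubkey before it leaves the box
gpg --encrypt --recipient michael@opensourceecology.org --output backup.tar.gz.gpg backup.tar.gz

# ship only the encrypted blob off-site
scp backup.tar.gz.gpg dreamhost:backups/
</pre>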
# also hardened the permissions on the backup log file, as it may leak passwords
<pre>
chown -R root:root /var/log/backups
chmod -R 0700 /var/log/backups
find /var/log/backups -type f -exec chmod 0600 {} \;
</pre>
# continuing with the jangouts poc, I began researching 'SDP', as that was the error that the server (shown below) & client spat out when attempting to load the Janus demo Text Room https://jangouts.opensourceecology.org/textroomtest.html
<pre>
Creating new session: 2994617815140817; 0x7f0884001580
Creating new handle in session 2994617815140817: 4577123645728553; 0x7f0884001580 0x7f0884079a90
[4577123645728553] Creating ICE agent (ICE Full mode, controlling)
[WARN] [4577123645728553] Skipping disabled/unsupported media line...
[WARN] [4577123645728553] Skipping disabled/unsupported media line...
[ERR] [janus.c:janus_process_incoming_request:1193] Error processing SDP
</pre>
# I also got a dump of the handle from the admin API when sitting in the text room
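## (for reference, such a handle dump comes from POSTing a handle_info request to the admin API; a sketch -- the port & admin_secret come from janus.cfg's admin section, and the values shown here are assumptions:)
<pre>
curl -k -H 'Content-Type: application/json' \
  -d '{"janus":"handle_info","transaction":"abc123","admin_secret":"janusoverlord"}' \
  https://localhost:7889/admin/390036153431556/778621082141321
</pre>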
<pre>
{
   "session_id": 390036153431556,
   "session_last_activity": 1846998549747,
   "session_transport": "janus.transport.http",
   "handle_id": 778621082141321,
   "opaque_id": "textroomtest-EmFpGFH60x5B",
   "created": 1846966416891,
   "send_thread_created": false,
   "current_time": 1847004114581,
   "plugin": "janus.plugin.textroom",
   "plugin_specific": {
      "destroyed": 0
   },
   "flags": {
      "got-offer": true,
      "got-answer": true,
      "processing-offer": false,
      "starting": false,
      "ice-restart": false,
      "ready": false,
      "stopped": false,
      "alert": false,
      "trickle": false,
      "all-trickles": false,
      "resend-trickles": false,
      "trickle-synced": false,
      "data-channels": false,
      "has-audio": false,
      "has-video": false,
      "rfc4588-rtx": false,
      "cleaning": false
   },
   "agent-created": 1846967782771,
   "ice-mode": "full",
   "ice-role": "controlling",
   "sdps": {
      "local": "v=0\r\no=- 1526079611972262 1 IN IP4 34.210.153.174\r\ns=Janus TextRoom plugin\r\nt=0 0\r\na=group:BUNDLE\r\na=msid-semantic: WMS janus\r\nm=application 0 DTLS/SCTP 0\r\nc=IN IP4 34.210.153.174\r\na=inactive\r\n"
   },
   "queued-packets": 0,
   "streams": [
      {
         "id": 1,
         "ready": -1,
         "ssrc": {},
         "direction": {
            "audio-send": false,
            "audio-recv": false,
            "video-send": false,
            "video-recv": false
         },
         "components": [
            {
               "id": 1,
               "state": "disconnected",
               "dtls": {
                  "fingerprint": "D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38",
                  "dtls-role": "actpass",
                  "dtls-state": "created",
                  "retransmissions": 0,
                  "valid": false,
                  "ready": false
               },
               "in_stats": {
                  "data_packets": 0,
                  "data_bytes": 0
               },
               "out_stats": {
                  "data_packets": 0,
                  "data_bytes": 0
               }
            }
         ]
      }
   ]
}
</pre>
# I changed the debug level from '4' (the default) to '7' = the maximum in janus.cfg. That produced a ton more output
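## (the relevant knob lives in the [general] section of janus.cfg; the path is an assumption based on our --prefix=/opt/janus install:)
<pre>
# /opt/janus/etc/janus/janus.cfg
[general]
debug_level = 7    ; 4 is the default, 7 is maximum verbosity
</pre>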
| <pre>
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Content-Length: 47
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] content-type: application/json
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Processing POST data (application/json) (47 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1248] -- Data we have now (47 bytes)
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Processing POST data (application/json) (0 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
| |
| {"janus":"create","transaction":"tT3AivyGrmwl"}
| |
| Forwarding request to the core (0x7fd43c007100)
| |
| Got a Janus API request from janus.transport.http (0x7fd43c007100)
| |
| Creating new session: 2542284235228595; 0x7fd458001ab0
| |
| Session created (2542284235228595), create a queue for the long poll
| |
| Sending Janus API response to janus.transport.http (0x7fd43c007100)
| |
| Got a Janus API response to send (0x7fd43c007100)
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Got a Janus API request from janus.transport.http (0x7fd43c001c10)
| |
| Session 2542284235228595 found... returning up to 1 messages
| |
| [transports/janus_http.c:janus_http_notifier:1723] ... handling long poll...
| |
| Got a keep-alive on session 2542284235228595
| |
| Sending Janus API response to janus.transport.http (0x7fd43c001c10)
| |
| Got a Janus API response to send (0x7fd43c001c10)
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP OPTIONS request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Access-Control-Request-Method: POST
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] Access-Control-Request-Headers: content-type
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept: */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| New connection on REST API: ::ffff:76.97.223.185
| |
| New connection on REST API: ::ffff:76.97.223.185
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Content-Length: 120
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] content-type: application/json
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Processing POST data (application/json) (120 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1248] -- Data we have now (120 bytes)
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Processing POST data (application/json) (0 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
| |
| {"janus":"attach","plugin":"janus.plugin.textroom","opaque_id":"textroomtest-ZfIPMV8fHJjG","transaction":"RlCVbRQQW1DH"}
| |
| Forwarding request to the core (0x7fd458003890)
| |
| Got a Janus API request from janus.transport.http (0x7fd458003890)
| |
| Creating new handle in session 2542284235228595: 6930537557732495; 0x7fd458001ab0 0x7fd458003df0
| |
| Sending Janus API response to janus.transport.http (0x7fd458003890)
| |
| Got a Janus API response to send (0x7fd458003890)
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP OPTIONS request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Access-Control-Request-Method: POST
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] Access-Control-Request-Headers: content-type
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept: */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| New connection on REST API: ::ffff:76.97.223.185
| |
| New connection on REST API: ::ffff:76.97.223.185
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Content-Length: 75
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] content-type: application/json
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Handle: 6930537557732495
| |
| Processing POST data (application/json) (75 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1248] -- Data we have now (75 bytes)
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Handle: 6930537557732495
| |
| Processing POST data (application/json) (0 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
| |
| {"janus":"message","body":{"request":"setup"},"transaction":"DQL62lpsIPOW"}
| |
| Forwarding request to the core (0x7fd45c002d70)
| |
| Got a Janus API request from janus.transport.http (0x7fd45c002d70)
| |
| Transport task pool, serving request
| |
| [6930537557732495] There's a message for JANUS TextRoom plugin
| |
| Creating plugin result...
| |
| Sending Janus API response to janus.transport.http (0x7fd45c002d70)
| |
| Got a Janus API response to send (0x7fd45c002d70)
| |
| Destroying plugin result...
| |
| [6930537557732495] Audio has NOT been negotiated
| |
| [6930537557732495] Video has NOT been negotiated
| |
| [6930537557732495] SCTP/DataChannels have NOT been negotiated
| |
| [6930537557732495] Setting ICE locally: got ANSWER (0 audios, 0 videos)
| |
| [6930537557732495] Creating ICE agent (ICE Full mode, controlling)
| |
| [6930537557732495] Adding 172.31.28.115 to the addresses to gather candidates for
| |
| [6930537557732495] Gathering done for stream 1
| |
| janus_dtls_bio_filter_ctrl: 6
| |
| -------------------------------------------
| |
| >> Anonymized
| |
| -------------------------------------------
| |
| [WARN] [6930537557732495] Skipping disabled/unsupported media line...
| |
| -------------------------------------------
| |
| >> Merged (193 bytes)
| |
| -------------------------------------------
| |
| v=0
| |
| o=- 1526081202248668 1 IN IP4 34.210.153.174
| |
| s=Janus TextRoom plugin
| |
| t=0 0
| |
| a=group:BUNDLE
| |
| a=msid-semantic: WMS janus
| |
| m=application 0 DTLS/SCTP 0
| |
| c=IN IP4 34.210.153.174
| |
| a=inactive
| |
| | |
| [6930537557732495] Sending event to transport...
| |
| Sending event to janus.transport.http (0x7fd43c007100)
| |
| Got a Janus API event to send (0x7fd43c007100)
| |
| >> Pushing event: 0 (took 368 us)
| |
| [6930537557732495] ICE thread started; 0x7fd458003df0
| |
| [ice.c:janus_ice_thread:2574] [6930537557732495] Looping (ICE)...
| |
| We have a message to serve...
| |
| {
| |
| "janus": "event",
| |
| "session_id": 2542284235228595,
| |
| "transaction": "DQL62lpsIPOW",
| |
| "sender": 6930537557732495,
| |
| "plugindata": {
| |
| "plugin": "janus.plugin.textroom",
| |
| "data": {
| |
| "textroom": "event",
| |
| "result": "ok"
| |
| }
| |
| },
| |
| "jsep": {
| |
| "type": "offer",
| |
| "sdp": "v=0\r\no=- 1526081202248668 1 IN IP4 34.210.153.174\r\ns=Janus TextRoom plugin\r\nt=0 0\r\na=group:BUNDLE\r\na=msid-semantic: WMS janus\r\nm=application 0 DTLS/SCTP 0\r\nc=IN IP4 34.210.153.174\r\na=inactive\r\n"
| |
| }
| |
| }
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Got a Janus API request from janus.transport.http (0x7fd43c001c10)
| |
| Session 2542284235228595 found... returning up to 1 messages
| |
| [transports/janus_http.c:janus_http_notifier:1723] ... handling long poll...
| |
| Got a keep-alive on session 2542284235228595
| |
| Sending Janus API response to janus.transport.http (0x7fd43c001c10)
| |
| Got a Janus API response to send (0x7fd43c001c10)
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] Content-Length: 310
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] content-type: application/json
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Handle: 6930537557732495
| |
| Processing POST data (application/json) (310 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1248] -- Data we have now (310 bytes)
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP POST request on /janus/2542284235228595/6930537557732495...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Handle: 6930537557732495
| |
| Processing POST data (application/json) (0 bytes)...
| |
| [transports/janus_http.c:janus_http_handler:1253] Done getting payload, we can answer
| |
| {"janus":"message","body":{"request":"ack"},"transaction":"rR1mPGNHkKYW","jsep":{"type":"answer","sdp":"v=0\r\no=- 5794779635134951790 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=msid-semantic: WMS\r\nm=application 0 DTLS/SCTP 5000\r\nc=IN IP4 0.0.0.0\r\na=mid:data\r\na=sctpmap:5000 webrtc-datachannel 1024\r\n"}}
| |
| Forwarding request to the core (0x7fd45c002d70)
| |
| Got a Janus API request from janus.transport.http (0x7fd45c002d70)
| |
| Transport task pool, serving request
| |
| [6930537557732495] There's a message for JANUS TextRoom plugin
| |
| [6930537557732495] Remote SDP:
| |
| v=0
| |
| o=- 5794779635134951790 2 IN IP4 127.0.0.1
| |
| s=-
| |
| t=0 0
| |
| a=msid-semantic: WMS
| |
| m=application 0 DTLS/SCTP 5000
| |
| c=IN IP4 0.0.0.0
| |
| a=mid:data
| |
| a=sctpmap:5000 webrtc-datachannel 1024
| |
| [6930537557732495] Audio has NOT been negotiated, Video has NOT been negotiated, SCTP/DataChannels have NOT been negotiated
| |
| [WARN] [6930537557732495] Skipping disabled/unsupported media line...
| |
| [ERR] [janus.c:janus_process_incoming_request:1193] Error processing SDP
| |
| [rR1mPGNHkKYW] Returning Janus API error 465 (Error processing SDP)
| |
| Got a Janus API response to send (0x7fd45c002d70)
| |
| Long poll time out for session 2542284235228595...
| |
| We have a message to serve...
| |
| {
| |
| "janus": "keepalive"
| |
| }
| |
| [transports/janus_http.c:janus_http_handler:1137] Got a HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1138] ... Just parsing headers for now...
| |
| [transports/janus_http.c:janus_http_headers:1690] Host: jangouts.opensourceecology.org:8089
| |
| [transports/janus_http.c:janus_http_headers:1690] Connection: keep-alive
| |
| [transports/janus_http.c:janus_http_headers:1690] accept: application/json, text/plain, */*
| |
| [transports/janus_http.c:janus_http_headers:1690] Origin: https://jangouts.opensourceecology.org
| |
| [transports/janus_http.c:janus_http_headers:1690] User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36
| |
| [transports/janus_http.c:janus_http_headers:1690] Referer: https://jangouts.opensourceecology.org/textroomtest.html
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Encoding: gzip, deflate, sdch, br
| |
| [transports/janus_http.c:janus_http_headers:1690] Accept-Language: en-US,en;q=0.8
| |
| [transports/janus_http.c:janus_http_handler:1170] Processing HTTP GET request on /janus/2542284235228595...
| |
| [transports/janus_http.c:janus_http_handler:1223] ... parsing request...
| |
| Session: 2542284235228595
| |
| Got a Janus API request from janus.transport.http (0x7fd43c001c10)
| |
| Session 2542284235228595 found... returning up to 1 messages
| |
| [transports/janus_http.c:janus_http_notifier:1723] ... handling long poll...
| |
| Got a keep-alive on session 2542284235228595
| |
| Sending Janus API response to janus.transport.http (0x7fd43c001c10)
| |
| Got a Janus API response to send (0x7fd43c001c10)
| |
| [...the same keep-alive long-poll cycle repeats verbatim several more times; duplicates omitted...]
| |
| [file-live-sample] Rewind! (/opt/janus/share/janus/streams/radio.alaw)
| |
| [...the remaining keep-alive long-poll cycles, again verbatim duplicates of the above, omitted...]
| |
| </pre>
| |
| # the description of the "Text Room" plugin says "A text room demo, using DataChannels only."
| |
| # checking the output above, I see that "data-channels" is listed as "false" in the "flags" section
| |
| | |
| =Thr May 10, 2018=
| |
| # attempted to install jangouts on our ec2 instance where the janus gateway is now properly installed
| |
| <pre>
| |
| wget https://github.com/jangouts/jangouts/archive/v0.4.7.tar.gz
| |
| tar -xzvf v0.4.7.tar.gz
| |
| rsync -av jangouts-0.4.7/dist /var/www/html/jangouts.opensourceecology.org/htdocs/jangouts
| |
| </pre>
| |
| # at first, clicking the login button did nothing. there was no room list. eventually, a timeout appeared. the fix here was to set "janusServerSSL" to "https://jangouts.opensourceecology.org:8089/janus"
| |
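| ## for reference, a rough sketch of the relevant keys in the jangouts config file (only "janusServerSSL" & "janusDebug" are confirmed here; the "janusServer" key & the plain-http port 8088 are from memory & may differ):
| |
| <pre>
| |
| {
| |
|   "janusServer": "http://jangouts.opensourceecology.org:8088/janus",
| |
|   "janusServerSSL": "https://jangouts.opensourceecology.org:8089/janus",
| |
|   "janusDebug": false
| |
| }
| |
| </pre>
| |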
| # I connected 3 clients. it ran slow, but it worked.
| |
| # what was not working, however, was the text box. When I typed a message & pressed 'Send', I saw "Data channel not open yet. Skipping" in the developer console
| |
| # I enabled "janusDebug" in the jangouts config file, but--while the initial connection was much, much more verbose in the dev console, the message that would pop-up in the console when I sent a text message would be the same "Data channel not open yet. Skipping."
| |
| # in fact, the janus demo "text room" doesn't work either. so if I fix this in janus it would probably work in jangouts http://jangouts.opensourceecology.org/textroomtest.html
| |
| ## both the browser console and the server log produce the error "Error processing SDP" when attempting to load the janus text room demo
| |
| # I found a few lines that mention SDP in the main config files
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus]# grep -ir 'sdp' *
| |
| janus.cfg:;interface = 1.2.3.4 ; Interface to use (will be used in SDP)
| |
| janus.cfg:; candidates from users, but sends its own within the SDP), and whether
| |
| janus.cfg.orig:;interface = 1.2.3.4 ; Interface to use (will be used in SDP)
| |
| janus.cfg.orig:; candidates from users, but sends its own within the SDP), and whether
| |
| janus.cfg.sample:;interface = 1.2.3.4 ; Interface to use (will be used in SDP)
| |
| janus.cfg.sample:; candidates from users, but sends its own within the SDP), and whether
| |
| janus.plugin.streaming.cfg:; SDP rtpmap and fmtp attributes the remote camera or RTSP server sent.
| |
| janus.plugin.streaming.cfg.sample:; SDP rtpmap and fmtp attributes the remote camera or RTSP server sent.
| |
| [root@ip-172-31-28-115 janus]#
| |
| </pre>
| |
| # Marcin just forwarded me an email from dreamhost that suggested our account had been hacked. The site they're referring to is a drupal site that I didn't know we had. Drupal recently released patches for critical vulnerabilities, so that makes sense. I wouldn't generally be concerned about this, but our backups (which contain many of our config files with our passwords in them) are stored on this same server--albeit in a different account, under a different user's home directory.
| |
| ## what I'd *like* to do is immediately block all non-port-22 access to this server and start digging into the logs and open connections. Unfortunately, this is a shared hosting server, and I don't have root.
| |
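| ## (for reference, with root that lockdown would just be a few iptables rules; a sketch:)
| |
| <pre>
| |
| # permit loopback, already-established connections, & inbound ssh; drop everything else
| |
| iptables -A INPUT -i lo -j ACCEPT
| |
| iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
| |
| iptables -A INPUT -p tcp --dport 22 -j ACCEPT
| |
| iptables -P INPUT DROP
| |
| </pre>
| |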
| ## instead, I'll just update our backup scripts to actually encrypt them before sending to dreamhost (which is what I've implemented for our process to upload our backups to glacier)
| |
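| ## a sketch of what that encryption step might look like (assumptions: gpg is available on the source host, the passphrase lives in a root-only file, & the filenames are illustrative):
| |
| <pre>
| |
| # symmetrically encrypt the backup tarball before it ever leaves the box
| |
| gpg --batch --symmetric --cipher-algo AES256 --passphrase-file /root/backups/backup.key --output hetzner2-20180510.tar.gz.gpg hetzner2-20180510.tar.gz
| |
| </pre>
| |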
| # I dug through the dreamhost dashboard & couldn't find any firewall settings
| |
| # I confirmed that the permissions on our backup data dirs aren't tight enough
| |
| <pre>
| |
| hancock% ls -lah hetzner1 hetzner2
| |
| hetzner1:
| |
| total 8.0K
| |
| drwxr-xr-x 10 marcin_ose pg1589252 4.0K May 10 01:21 .
| |
| drwx--x--- 26 marcin_ose adm 4.0K May 10 14:42 ..
| |
| drwxr-xr-x 3 marcin_ose pg1589252 32 May 5 22:20 20180501-052002
| |
| drwxr-xr-x 4 marcin_ose pg1589252 45 May 6 22:20 20180502-052001
| |
| drwxr-xr-x 2 marcin_ose pg1589252 10 May 9 22:20 20180505-052001
| |
| drwxr-xr-x 4 marcin_ose pg1589252 52 May 9 22:20 20180506-052001
| |
| drwxr-xr-x 5 marcin_ose pg1589252 67 May 6 22:30 20180507-052001
| |
| drwxr-xr-x 5 marcin_ose pg1589252 67 May 7 22:30 20180508-052001
| |
| drwxr-xr-x 5 marcin_ose pg1589252 67 May 8 22:30 20180509-052001
| |
| drwxr-xr-x 5 marcin_ose pg1589252 67 May 9 22:30 20180510-052001
| |
| | |
| hetzner2:
| |
| total 8.0K
| |
| drwxr-xr-x 9 marcin_ose pg1589252 4.0K May 10 00:31 .
| |
| drwx--x--- 26 marcin_ose adm 4.0K May 10 14:42 ..
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 1 00:21 20180501-072001
| |
| drwxr-xr-x 2 marcin_ose pg1589252 10 May 9 22:20 20180505-072001
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 6 00:21 20180506-072001
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 7 00:21 20180507-072001
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 8 00:21 20180508-072001
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 9 00:21 20180509-072001
| |
| drwxr-xr-x 8 marcin_ose pg1589252 102 May 10 00:21 20180510-072001
| |
| hancock%
| |
| </pre>
| |
| ## digging in the dreamhost panel shows that this 'pg1589252' group contains all our users = 'marcin_ose, ose_site, ose_community, osecolby, osebackup'
| |
| ## I can't set it to 'marcin_ose:marcin_ose' as there is no group 'marcin_ose'
| |
| ## I can't set the group ownership to 'root' as I don't have permission to do that
| |
| ## there's an 'adm' group, but that has a user 'dhapache' in it
| |
| ## so I think we have to leave the group as-is and just zero out the group & other permission bits
| |
| <pre>
| |
| hancock% chmod -R 0700 hetzner1
| |
| hancock% chmod -R 0700 hetzner2
| |
| hancock% find hetzner1 -type f -exec chmod 0600 {} \;
| |
| hancock% find hetzner2 -type f -exec chmod 0600 {} \;
| |
| </pre>
| |
| # and confirmation of the new permissions looks good
| |
| <pre>
| |
| hancock% ls -lah hetzner1 hetzner2
| |
| hetzner1:
| |
| total 8.0K
| |
| drwx------ 10 marcin_ose pg1589252 4.0K May 10 01:21 .
| |
| drwx--x--- 26 marcin_ose adm 4.0K May 10 19:27 ..
| |
| drwx------ 3 marcin_ose pg1589252 32 May 5 22:20 20180501-052002
| |
| drwx------ 4 marcin_ose pg1589252 45 May 6 22:20 20180502-052001
| |
| drwx------ 2 marcin_ose pg1589252 10 May 9 22:20 20180505-052001
| |
| drwx------ 4 marcin_ose pg1589252 52 May 9 22:20 20180506-052001
| |
| drwx------ 5 marcin_ose pg1589252 67 May 6 22:30 20180507-052001
| |
| drwx------ 5 marcin_ose pg1589252 67 May 7 22:30 20180508-052001
| |
| drwx------ 5 marcin_ose pg1589252 67 May 8 22:30 20180509-052001
| |
| drwx------ 5 marcin_ose pg1589252 67 May 9 22:30 20180510-052001
| |
| | |
| hetzner2:
| |
| total 8.0K
| |
| drwx------ 9 marcin_ose pg1589252 4.0K May 10 00:31 .
| |
| drwx--x--- 26 marcin_ose adm 4.0K May 10 19:27 ..
| |
| drwx------ 8 marcin_ose pg1589252 102 May 1 00:21 20180501-072001
| |
| drwx------ 2 marcin_ose pg1589252 10 May 9 22:20 20180505-072001
| |
| drwx------ 8 marcin_ose pg1589252 102 May 6 00:21 20180506-072001
| |
| drwx------ 8 marcin_ose pg1589252 102 May 7 00:21 20180507-072001
| |
| drwx------ 8 marcin_ose pg1589252 102 May 8 00:21 20180508-072001
| |
| drwx------ 8 marcin_ose pg1589252 102 May 9 00:21 20180509-072001
| |
| drwx------ 8 marcin_ose pg1589252 102 May 10 00:21 20180510-072001
| |
| hancock%
| |
| </pre>
| |
| # I reset the password of the "ose_community" user from the dreamhost dashboard. The max password length was only 31 characters though >:\ I put the new password in our keepass
| |
| # the closest thing I could find to iptables blocking or stopping the service was to go to the dreamhost panel and click "Domains" > "Manage Domains". For each of the sites, I clicked "Remove" under the "Web Hosting" section. This doesn't appear to delete files, just clear out the vhost that makes the folder public (both for dns & ip). I did this for 'openfarmtech.org', 'dhblog.openfarmtech.org', 'dreamhost.openfarmtech.org', 'opensourceecology.org', 'blog.opensourceecology.org', 'community.opensourceecology.org', 'eerik.opensourceecology.org', 'forum.opensourceecology.org'
| |
| ## there's also 'civicrm.opensourceecology.org', but it was already listed as 'none'
| |
| ## after this change, I confirmed that the sites went down (curl responds with a timeout)
| |
| <pre>
| |
| user@ose:~$ curl 208.113.185.71
| |
| curl: (7) Failed to connect to 208.113.185.71 port 80: Connection timed out
| |
| user@ose:~$
| |
| </pre>
| |
| # moreover, nmap doesn't show port 80 anymore
| |
| <pre>
| |
| user@personal:~$ nmap -Pn 208.113.185.71
| |
| | |
| Starting Nmap 6.47 ( http://nmap.org ) at 2018-05-10 23:00 EDT
| |
| Nmap scan report for openfarmtech.org (208.113.185.71)
| |
| Host is up (0.037s latency).
| |
| Not shown: 995 filtered ports
| |
| PORT STATE SERVICE
| |
| 21/tcp open ftp
| |
| 22/tcp open ssh
| |
| 587/tcp open submission
| |
| 5222/tcp open xmpp-client
| |
| 5269/tcp open xmpp-server
| |
| | |
| Nmap done: 1 IP address (1 host up) scanned in 11.71 seconds
| |
| user@personal:~$
| |
| </pre>
| |
| | |
| =Wed May 09, 2018=
| |
| # updated Jitsi
| |
| # documented Janus
| |
| # enabled the janus admin api via /opt/janus/etc/janus/janus.transport.http.cfg
| |
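| ## the relevant bits of that file look something like the below (a sketch from memory of this janus version's sample config, so exact key names may differ; the admin_secret itself lives in janus.cfg):
| |
| <pre>
| |
| [admin]
| |
| admin_base_path = /admin     ; base path of the admin/monitor api
| |
| admin_http = yes             ; enable the plain-http admin interface
| |
| admin_port = 7088            ; plain-http admin port
| |
| admin_https = yes            ; enable the https admin interface
| |
| admin_secure_port = 7889     ; https admin port
| |
| </pre>
| |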
| # opened ports 7088-7089 on the videoconf-dev security group in the aws console
| |
| # apparently the https version uses a different port, so I enabled port 7889 as well; this worked
| |
| <pre>
| |
| user@ose:~$ curl -k https://jangouts.opensourceecology.org:7889/admin
| |
| {
| |
| "janus": "error",
| |
| "error": {
| |
| "code": 454,
| |
| "reason": "Request payload missing"
| |
| }
| |
| user@ose:~$
| |
| </pre>
| |
| # got some actual data from the API!
| |
| <pre>
| |
| user@ose:~$ curl -k -X POST -d '{"janus": "list_sessions", "transaction": "123", "admin_secret": "janusoverlord"}' https://jangouts.opensourceecology.org:7889/admin
| |
| {
| |
| "janus": "success",
| |
| "transaction": "123",
| |
| "sessions": [
| |
| 6109314374460183
| |
| ]
| |
| }user@ose:~$
| |
| </pre>
| |
| # and I was able to get a list of handles for the given session (note that the session id is passed in the URL, not as a variable)
| |
| <pre>
| |
| user@ose:~$ curl -k -X POST -d '{"janus": "list_handles", "transaction": "123", "admin_secret": "janusoverlord"}' https://jangouts.opensourceecology.org:7889/admin/6109314374460183
| |
| {
| |
| "janus": "success",
| |
| "session_id": 6109314374460183,
| |
| "transaction": "123",
| |
| "handles": [
| |
| 8526612291744095
| |
| ]
| |
| }user@ose:~$
| |
| </pre>
| |
| # and I was able to get the info on this handle
| |
| <pre>
| |
| user@ose:~$ curl -k -X POST -d '{"janus": "handle_info", "transaction": "123", "admin_secret": "janusoverlord"}' "https://jangouts.opensourceecology.org:7889/admin/6109314374460183/8526612291744095"
| |
| {
| |
| "janus": "success",
| |
| "session_id": 6109314374460183,
| |
| "transaction": "123",
| |
| "handle_id": 8526612291744095,
| |
| "info": {
| |
| "session_id": 6109314374460183,
| |
| "session_last_activity": 1650443053811,
| |
| "session_transport": "janus.transport.http",
| |
| "handle_id": 8526612291744095,
| |
| "opaque_id": "videoroomtest-uAZvzmILiCiL",
| |
| "created": 1650028494963,
| |
| "send_thread_created": false,
| |
| "current_time": 1650464664821,
| |
| "plugin": "janus.plugin.videoroom",
| |
| "plugin_specific": {
| |
| "type": "publisher",
| |
| "room": 1234,
| |
| "id": 7450106407003965,
| |
| "private_id": 759484767,
| |
| "display": "tester",
| |
| "media": {
| |
| "audio": true,
| |
| "audio_codec": "opus",
| |
| "video": true,
| |
| "video_codec": "vp8",
| |
| "data": false
| |
| },
| |
| "bitrate": 128000,
| |
| "audio-level-dBov": 0,
| |
| "talking": false,
| |
| "hangingup": 1,
| |
| "destroyed": 0
| |
| },
| |
| "flags": {
| |
| "got-offer": true,
| |
| "got-answer": true,
| |
| "processing-offer": false,
| |
| "starting": true,
| |
| "ice-restart": false,
| |
| "ready": false,
| |
| "stopped": false,
| |
| "alert": true,
| |
| "trickle": true,
| |
| "all-trickles": true,
| |
| "resend-trickles": false,
| |
| "trickle-synced": false,
| |
| "data-channels": false,
| |
| "has-audio": true,
| |
| "has-video": true,
| |
| "rfc4588-rtx": false,
| |
| "cleaning": false
| |
| },
| |
| "sdps": {},
| |
| "queued-packets": 0,
| |
| "streams": []
| |
| }
| |
| user@ose:~$
| |
| </pre>
| |
| # I clicked "publish" in the video room plugin on my browser running on our server. My webcam light lit-up, and I re-ran the above command
| |
| <pre>
| |
| user@ose:~$ curl -k -X POST -d '{"janus": "handle_info", "transaction": "123", "admin_secret": "janusoverlord"}' "https://jangouts.opensourceecology.org:7889/admin/6109314374460183/8526612291744095"
| |
| {
| |
| "janus": "success",
| |
| "session_id": 6109314374460183,
| |
| "transaction": "123",
| |
| "handle_id": 8526612291744095,
| |
| "info": {
| |
| "session_id": 6109314374460183,
| |
| "session_last_activity": 1650508989240,
| |
| "session_transport": "janus.transport.http",
| |
| "handle_id": 8526612291744095,
| |
| "opaque_id": "videoroomtest-uAZvzmILiCiL",
| |
| "created": 1650028494963,
| |
| "send_thread_created": false,
| |
| "current_time": 1650511883011,
| |
| "plugin": "janus.plugin.videoroom",
| |
| "plugin_specific": {
| |
| "type": "publisher",
| |
| "room": 1234,
| |
| "id": 7450106407003965,
| |
| "private_id": 759484767,
| |
| "display": "tester",
| |
| "media": {
| |
| "audio": true,
| |
| "audio_codec": "opus",
| |
| "video": true,
| |
| "video_codec": "vp8",
| |
| "data": false
| |
| },
| |
| "bitrate": 128000,
| |
| "audio-level-dBov": 0,
| |
| "talking": false,
| |
| "hangingup": 0,
| |
| "destroyed": 0
| |
| },
| |
| "flags": {
| |
| "got-offer": true,
| |
| "got-answer": true,
| |
| "processing-offer": false,
| |
| "starting": true,
| |
| "ice-restart": false,
| |
| "ready": false,
| |
| "stopped": false,
| |
| "alert": false,
| |
| "trickle": true,
| |
| "all-trickles": true,
| |
| "resend-trickles": false,
| |
| "trickle-synced": false,
| |
| "data-channels": false,
| |
| "has-audio": true,
| |
| "has-video": true,
| |
| "rfc4588-rtx": false,
| |
| "cleaning": false
| |
| },
| |
| "agent-created": 1650508599901,
| |
| "ice-mode": "full",
| |
| "ice-role": "controlled",
| |
| "sdps": {
| |
| "profile": "UDP/TLS/RTP/SAVPF",
| |
| "local": "v=0\r\no=- 1448085342088977254 2 IN IP4 172.31.28.115\r\ns=VideoRoom 1234\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS janus\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111\r\nc=IN IP4 172.31.28.115\r\na=recvonly\r\na=mid:audio\r\na=rtcp-mux\r\na=ice-ufrag:s/cD\r\na=ice-pwd:GXXcK8+fezHFx35asoFADe\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:111 opus/48000/2\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=candidate:1 1 udp 2013266431 172.31.28.115 39912 typ host\r\na=end-of-candidates\r\nm=video 9 UDP/TLS/RTP/SAVPF 96\r\nc=IN IP4 172.31.28.115\r\na=recvonly\r\na=mid:video\r\na=rtcp-mux\r\na=ice-ufrag:s/cD\r\na=ice-pwd:GXXcK8+fezHFx35asoFADe\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=candidate:1 1 udp 2013266431 172.31.28.115 39912 typ host\r\na=end-of-candidates\r\n",
| |
| "remote": "v=0\r\no=- 1448085342088977254 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:acE8\r\na=ice-pwd:QaTF0gUTVVtelbkhtd95f+Uf\r\na=fingerprint:sha-256 CE:80:6F:A9:94:21:10:C7:F1:15:38:3D:2A:D2:DC:13:1C:CB:D0:D9:FC:12:C3:87:A7:CB:E4:C6:AC:DC:E7:E9\r\na=setup:actpass\r\na=mid:audio\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=sendonly\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC/16000\r\na=rtpmap:104 ISAC/32000\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:106 CN/32000\r\na=rtpmap:105 CN/16000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:112 telephone-event/32000\r\na=rtpmap:113 telephone-event/16000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3970216167 cname:GAZb0NrxUOm0Gr\r\na=ssrc:3970216167 msid:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ 573226b4-f273-4b18-bcbf-14c5ea70c30b\r\na=ssrc:3970216167 mslabel:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ\r\na=ssrc:3970216167 label:573226b4-f273-4b18-bcbf-14c5ea70c30b\r\nm=video 9 UDP/TLS/RTP/SAVPF 96 98 100 102 127 97 99 101 125\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:acE8\r\na=ice-pwd:QaTF0gUTVVtelbkhtd95f+Uf\r\na=fingerprint:sha-256 CE:80:6F:A9:94:21:10:C7:F1:15:38:3D:2A:D2:DC:13:1C:CB:D0:D9:FC:12:C3:87:A7:CB:E4:C6:AC:DC:E7:E9\r\na=setup:actpass\r\na=mid:video\r\na=extmap:2 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:3 http:www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=sendonly\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtpmap:98 VP9/90000\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtpmap:100 H264/90000\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=fmtp:100 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:102 red/90000\r\na=rtpmap:127 ulpfec/90000\r\na=rtpmap:97 rtx/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:99 rtx/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:125 rtx/90000\r\na=fmtp:125 apt=102\r\na=ssrc-group:FID 928459620 3579834730\r\na=ssrc:928459620 cname:GAZb0NrxUOm0Gr\r\na=ssrc:928459620 msid:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ 5c33de72-77df-4524-90ca-0cc1539523d9\r\na=ssrc:928459620 mslabel:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ\r\na=ssrc:928459620 label:5c33de72-77df-4524-90ca-0cc1539523d9\r\na=ssrc:3579834730 cname:GAZb0NrxUOm0Gr\r\na=ssrc:3579834730 msid:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ 5c33de72-77df-4524-90ca-0cc1539523d9\r\na=ssrc:3579834730 mslabel:DMHJt4gzQBKEslKuTly54dUGS8CUpXTy7hpJ\r\na=ssrc:3579834730 label:5c33de72-77df-4524-90ca-0cc1539523d9\r\n"
| |
| },
| |
| "queued-packets": 0,
| |
| "streams": [
| |
| {
| |
| "id": 1,
| |
| "ready": -1,
| |
| "ssrc": {
| |
| "audio": 1695918016,
| |
| "video": 1878586123,
| |
| "audio-peer": 3970216167,
| |
| "video-peer": 928459620,
| |
| "video-peer-rtx": 3579834730
| |
| },
| |
| "direction": {
| |
| "audio-send": false,
| |
| "audio-recv": true,
| |
| "video-send": false,
| |
| "video-recv": true
| |
| },
| |
| "rtcp_stats": {
| |
| "audio": {
| |
| "base": 48000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 0,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 0,
| |
| "in-media-link-quality": 0,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| },
| |
| "video": {
| |
| "base": 90000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 0,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 0,
| |
| "in-media-link-quality": 0,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| }
| |
| },
| |
| "components": [
| |
| {
| |
| "id": 1,
| |
| "state": "connecting",
| |
| "local-candidates": [
| |
| "1 1 udp 2013266431 172.31.28.115 39912 typ host"
| |
| ],
| |
| "remote-candidates": [
| |
| "2774440166 1 udp 1686052607 76.97.XXX.YYY 52120 typ srflx raddr 10.137.2.17 rport 52120 generation 0 ufrag acE8 network-id 1 network-cost 50",
| |
| "201398067 1 udp 2122260223 10.137.2.17 52120 typ host generation 0 ufrag acE8 network-id 1 network-cost 50"
| |
| ],
| |
| "dtls": {
| |
| "fingerprint": "D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38",
| |
| "remote-fingerprint": "CE:80:6F:A9:94:21:10:C7:F1:15:38:3D:2A:D2:DC:13:1C:CB:D0:D9:FC:12:C3:87:A7:CB:E4:C6:AC:DC:E7:E9",
| |
| "remote-fingerprint-hash": "sha-256",
| |
| "dtls-role": "active",
| |
| "dtls-state": "created",
| |
| "retransmissions": 0,
| |
| "valid": false,
| |
| "ready": false
| |
| },
| |
| "in_stats": {
| |
| "audio_packets": 0,
| |
| "audio_bytes": 0,
| |
| "audio_bytes_lastsec": 0,
| |
| "do_audio_nacks": false,
| |
| "video_packets": 0,
| |
| "video_bytes": 0,
| |
| "video_bytes_lastsec": 0,
| |
| "do_video_nacks": true,
| |
| "video_nacks": 0,
| |
| "data_packets": 0,
| |
| "data_bytes": 0
| |
| },
| |
| "out_stats": {
| |
| "audio_packets": 0,
| |
| "audio_bytes": 0,
| |
| "audio_bytes_lastsec": 0,
| |
| "audio_nacks": 0,
| |
| "video_packets": 0,
| |
| "video_bytes": 0,
| |
| "video_bytes_lastsec": 0,
| |
| "video_nacks": 0,
| |
| "data_packets": 0,
| |
| "data_bytes": 0
| |
| }
| |
| }
| |
| ]
| |
| }
| |
| ]
| |
| }
| |
| }user@ose:~$
| |
| </pre>
| |
| # looks like the demos menu already contains an html page for accessing this info without having to query it with curl @ "Demos" > "Admin/Monitor" https://jangouts.opensourceecology.org/admin.html
| |
| # this is much easier to refresh, so I was able to extract more info at the right time
| |
| <pre>
| |
| {
| |
| "session_id": 6109314374460183,
| |
| "session_last_activity": 1651631733041,
| |
| "session_transport": "janus.transport.http",
| |
| "handle_id": 8526612291744095,
| |
| "opaque_id": "videoroomtest-uAZvzmILiCiL",
| |
| "created": 1650028494963,
| |
| "send_thread_created": false,
| |
| "current_time": 1651642194591,
| |
| "plugin": "janus.plugin.videoroom",
| |
| "plugin_specific": {
| |
| "type": "publisher",
| |
| "room": 1234,
| |
| "id": 7450106407003965,
| |
| "private_id": 759484767,
| |
| "display": "tester",
| |
| "media": {
| |
| "audio": true,
| |
| "audio_codec": "opus",
| |
| "video": true,
| |
| "video_codec": "vp8",
| |
| "data": false
| |
| },
| |
| "bitrate": 128000,
| |
| "audio-level-dBov": 0,
| |
| "talking": false,
| |
| "hangingup": 0,
| |
| "destroyed": 0
| |
| },
| |
| "flags": {
| |
| "got-offer": true,
| |
| "got-answer": true,
| |
| "processing-offer": false,
| |
| "starting": true,
| |
| "ice-restart": false,
| |
| "ready": false,
| |
| "stopped": false,
| |
| "alert": false,
| |
| "trickle": true,
| |
| "all-trickles": true,
| |
| "resend-trickles": false,
| |
| "trickle-synced": false,
| |
| "data-channels": false,
| |
| "has-audio": true,
| |
| "has-video": true,
| |
| "rfc4588-rtx": false,
| |
| "cleaning": false
| |
| },
| |
| "agent-created": 1651631274943,
| |
| "ice-mode": "full",
| |
| "ice-role": "controlled",
| |
| "sdps": {
| |
| "profile": "UDP/TLS/RTP/SAVPF",
| |
| "local": "v=0\r\no=- 3310046283117293242 2 IN IP4 172.31.28.115\r\ns=VideoRoom 1234\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS janus\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111\r\nc=IN IP4 172.31.28.115\r\na=recvonly\r\na=mid:audio\r\na=rtcp-mux\r\na=ice-ufrag:jJJ2\r\na=ice-pwd:FEMURHPpKGoFlvj6FWtTmn\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:111 opus/48000/2\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=candidate:1 1 udp 2013266431 172.31.28.115 58674 typ host\r\na=end-of-candidates\r\nm=video 9 UDP/TLS/RTP/SAVPF 96\r\nc=IN IP4 172.31.28.115\r\na=recvonly\r\na=mid:video\r\na=rtcp-mux\r\na=ice-ufrag:jJJ2\r\na=ice-pwd:FEMURHPpKGoFlvj6FWtTmn\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=candidate:1 1 udp 2013266431 172.31.28.115 58674 typ host\r\na=end-of-candidates\r\n",
| |
| "remote": "v=0\r\no=- 3310046283117293242 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:9l+1\r\na=ice-pwd:lDIWJ39vbqulMzrUpRXyJIi9\r\na=fingerprint:sha-256 19:15:2B:C6:7B:D7:A4:8E:A0:7F:45:6A:5A:65:54:77:D3:A0:8E:F9:09:85:3D:49:ED:AF:CE:C5:D5:FF:06:36\r\na=setup:actpass\r\na=mid:audio\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=sendonly\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC/16000\r\na=rtpmap:104 ISAC/32000\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:106 CN/32000\r\na=rtpmap:105 CN/16000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:112 telephone-event/32000\r\na=rtpmap:113 telephone-event/16000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3122267103 cname:/z3txAJMnmS+uJmM\r\na=ssrc:3122267103 msid:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v f8d72237-8d1a-4342-86d6-dd96ce507513\r\na=ssrc:3122267103 mslabel:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v\r\na=ssrc:3122267103 label:f8d72237-8d1a-4342-86d6-dd96ce507513\r\nm=video 9 UDP/TLS/RTP/SAVPF 96 98 100 102 127 97 99 101 125\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:9l+1\r\na=ice-pwd:lDIWJ39vbqulMzrUpRXyJIi9\r\na=fingerprint:sha-256 19:15:2B:C6:7B:D7:A4:8E:A0:7F:45:6A:5A:65:54:77:D3:A0:8E:F9:09:85:3D:49:ED:AF:CE:C5:D5:FF:06:36\r\na=setup:actpass\r\na=mid:video\r\na=extmap:2 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=sendonly\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtpmap:98 VP9/90000\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtpmap:100 H264/90000\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=fmtp:100 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:102 red/90000\r\na=rtpmap:127 ulpfec/90000\r\na=rtpmap:97 rtx/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:99 rtx/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:125 rtx/90000\r\na=fmtp:125 apt=102\r\na=ssrc-group:FID 3519146384 830159624\r\na=ssrc:3519146384 cname:/z3txAJMnmS+uJmM\r\na=ssrc:3519146384 msid:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v 8da2323f-739e-437a-b525-2f141888ca11\r\na=ssrc:3519146384 mslabel:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v\r\na=ssrc:3519146384 label:8da2323f-739e-437a-b525-2f141888ca11\r\na=ssrc:830159624 cname:/z3txAJMnmS+uJmM\r\na=ssrc:830159624 msid:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v 8da2323f-739e-437a-b525-2f141888ca11\r\na=ssrc:830159624 mslabel:QIHUJyUcIX0PP8kBotpf0dm1jT7wdwshp52v\r\na=ssrc:830159624 label:8da2323f-739e-437a-b525-2f141888ca11\r\n"
| |
| },
| |
| "queued-packets": 0,
| |
| "streams": [
| |
| {
| |
| "id": 1,
| |
| "ready": -1,
| |
| "ssrc": {
| |
| "audio": 1320172211,
| |
| "video": 3686210926,
| |
| "audio-peer": 3122267103,
| |
| "video-peer": 3519146384,
| |
| "video-peer-rtx": 830159624
| |
| },
| |
| "direction": {
| |
| "audio-send": false,
| |
| "audio-recv": true,
| |
| "video-send": false,
| |
| "video-recv": true
| |
| },
| |
| "rtcp_stats": {
| |
| "audio": {
| |
| "base": 48000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 0,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 0,
| |
| "in-media-link-quality": 0,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| },
| |
| "video": {
| |
| "base": 90000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 0,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 0,
| |
| "in-media-link-quality": 0,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| }
| |
| },
| |
| "components": [
| |
| {
| |
| "id": 1,
| |
| "state": "failed",
| |
| "failed-detected": 1651640475831,
| |
| "icetimer-started": true,
| |
| "local-candidates": [
| |
| "1 1 udp 2013266431 172.31.28.115 58674 typ host"
| |
| ],
| |
| "remote-candidates": [
| |
| "201398067 1 udp 2122260223 10.137.2.17 60985 typ host generation 0 ufrag 9l+1 network-id 1 network-cost 50",
| |
| "2774440166 1 udp 1686052607 76.97.XXX.YYY 60985 typ srflx raddr 10.137.2.17 rport 60985 generation 0 ufrag 9l+1 network-id 1 network-cost 50"
| |
| ],
| |
| "dtls": {
| |
| "fingerprint": "D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38",
| |
| "remote-fingerprint": "19:15:2B:C6:7B:D7:A4:8E:A0:7F:45:6A:5A:65:54:77:D3:A0:8E:F9:09:85:3D:49:ED:AF:CE:C5:D5:FF:06:36",
| |
| "remote-fingerprint-hash": "sha-256",
| |
| "dtls-role": "active",
| |
| "dtls-state": "created",
| |
| "retransmissions": 0,
| |
| "valid": false,
| |
| "ready": false
| |
| },
| |
| "in_stats": {
| |
| "audio_packets": 0,
| |
| "audio_bytes": 0,
| |
| "audio_bytes_lastsec": 0,
| |
| "do_audio_nacks": false,
| |
| "video_packets": 0,
| |
| "video_bytes": 0,
| |
| "video_bytes_lastsec": 0,
| |
| "do_video_nacks": true,
| |
| "video_nacks": 0,
| |
| "data_packets": 0,
| |
| "data_bytes": 0
| |
| },
| |
| "out_stats": {
| |
| "audio_packets": 0,
| |
| "audio_bytes": 0,
| |
| "audio_bytes_lastsec": 0,
| |
| "audio_nacks": 0,
| |
| "video_packets": 0,
| |
| "video_bytes": 0,
| |
| "video_bytes_lastsec": 0,
| |
| "video_nacks": 0,
| |
| "data_packets": 0,
| |
| "data_bytes": 0
| |
| }
| |
| }
| |
| ]
| |
| }
| |
| ]
| |
| }
| |
| </pre>
| |
| # so the above output shows us that the ICE connection "failed". It shows the AWS private ip address in the 'local-candidates' section and both the private and public ip addresses of my laptop in the 'remote-candidates' section. My best guess is that the issue is that the AWS ip is private; the actual public ip is 34.210.153.174, but that's not even visible on the node
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# ip -a a
| |
| 1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1
| |
| link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
| |
| inet 127.0.0.1/8 scope host lo
| |
| valid_lft forever preferred_lft forever
| |
| inet6 ::1/128 scope host
| |
| valid_lft forever preferred_lft forever
| |
| 2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 9001 qdisc pfifo_fast state UP group default qlen 1000
| |
| link/ether 02:58:0c:bc:ab:50 brd ff:ff:ff:ff:ff:ff
| |
| inet 172.31.28.115/20 brd 172.31.31.255 scope global dynamic eth0
| |
| valid_lft 3586sec preferred_lft 3586sec
| |
| inet6 fe80::58:cff:febc:ab50/64 scope link
| |
| valid_lft forever preferred_lft forever
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| [root@ip-172-31-28-115 htdocs]# dig +short jangouts.opensourceecology.org
| |
| 34.210.153.174
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # a bit of googling suggests that you can't actually bind to the public ip address inside the ec2 cloud, but that it's a 1:1 NAT mapping, so that shouldn't be an issue. Digging through the config files, I did find that AWS is specifically mentioned for the 'nat_1_1_mapping' variable
| |
| <blockquote>
| |
| ; In case you're deploying Janus on a server which is configured with
| |
| ; a 1:1 NAT (e.g., Amazon EC2), you might want to also specify the public
| |
| ; address of the machine using the setting below. This will result in
| |
| ; all host candidates (which normally have a private IP address) to
| |
| ; be rewritten with the public address provided in the settings. As
| |
| ; such, use the option with caution and only if you know what you're doing.
| |
| ; Besides, it's still recommended to also enable STUN in those cases,
| |
| ; and keep ICE Lite disabled as it's not strictly speaking a public server.
| |
| ;nat_1_1_mapping = 1.2.3.4
| |
| </blockquote>
| |
| # I changed this to the actual public ip, and restarted janus
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus]# cp janus.cfg janus.cfg.orig
| |
| [root@ip-172-31-28-115 janus]# vim janus.cfg
| |
| [root@ip-172-31-28-115 janus]# diff janus.cfg.orig janus.cfg
| |
| 132a133
| |
| > nat_1_1_mapping = 34.210.153.174
| |
| </pre>
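| |
| ## note: per the 'janus --help' output further down this page, the same fix can apparently also be applied at the command line with the '-1'/'--nat-1-1' flag instead of editing janus.cfg, e.g. (equivalent sketch, untested here)
| |
| <pre>
| |
| /opt/janus/bin/janus --nat-1-1=34.210.153.174
| |
| </pre>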
| |
| # that did it! the demos are working as expected now!! this is how it should look during an echo test
| |
| <pre>
| |
| {
| |
| "session_id": 8969403825798607,
| |
| "session_last_activity": 1668174556992,
| |
| "session_transport": "janus.transport.http",
| |
| "handle_id": 5189677910802057,
| |
| "opaque_id": "echotest-MLVUW6wG3cOM",
| |
| "created": 1668169108013,
| |
| "send_thread_created": true,
| |
| "current_time": 1668184930779,
| |
| "plugin": "janus.plugin.echotest",
| |
| "plugin_specific": {
| |
| "audio_active": true,
| |
| "video_active": true,
| |
| "audio_codec": "opus",
| |
| "video_codec": "vp8",
| |
| "bitrate": 0,
| |
| "peer-bitrate": 0,
| |
| "slowlink_count": 0,
| |
| "hangingup": 0,
| |
| "destroyed": 0
| |
| },
| |
| "flags": {
| |
| "got-offer": true,
| |
| "got-answer": true,
| |
| "processing-offer": false,
| |
| "starting": true,
| |
| "ice-restart": false,
| |
| "ready": true,
| |
| "stopped": false,
| |
| "alert": false,
| |
| "trickle": true,
| |
| "all-trickles": true,
| |
| "resend-trickles": false,
| |
| "trickle-synced": false,
| |
| "data-channels": false,
| |
| "has-audio": true,
| |
| "has-video": true,
| |
| "rfc4588-rtx": false,
| |
| "cleaning": false
| |
| },
| |
| "agent-created": 1668171794627,
| |
| "ice-mode": "full",
| |
| "ice-role": "controlled",
| |
| "sdps": {
| |
| "profile": "UDP/TLS/RTP/SAVPF",
| |
| "local": "v=0\r\no=- 3511923798191890868 2 IN IP4 34.210.153.174\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS janus\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111\r\nc=IN IP4 34.210.153.174\r\na=sendrecv\r\na=mid:audio\r\na=rtcp-mux\r\na=ice-ufrag:6Erz\r\na=ice-pwd:RT7PufgrzLp3jAREqbRfn2\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:111 opus/48000/2\r\na=ssrc:3772083770 cname:janusaudio\r\na=ssrc:3772083770 msid:janus janusa0\r\na=ssrc:3772083770 mslabel:janus\r\na=ssrc:3772083770 label:janusa0\r\na=candidate:1 1 udp 2013266431 34.210.153.174 39816 typ host\r\na=end-of-candidates\r\nm=video 9 UDP/TLS/RTP/SAVPF 96\r\nc=IN IP4 34.210.153.174\r\na=sendrecv\r\na=mid:video\r\na=rtcp-mux\r\na=ice-ufrag:6Erz\r\na=ice-pwd:RT7PufgrzLp3jAREqbRfn2\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=ssrc:1648006521 cname:janusvideo\r\na=ssrc:1648006521 msid:janus janusv0\r\na=ssrc:1648006521 mslabel:janus\r\na=ssrc:1648006521 label:janusv0\r\na=candidate:1 1 udp 2013266431 34.210.153.174 39816 typ host\r\na=end-of-candidates\r\nm=application 0 DTLS/SCTP 0\r\nc=IN IP4 34.210.153.174\r\na=inactive\r\n",
| |
| "remote": "v=0\r\no=- 3511923798191890868 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video data\r\na=msid-semantic: WMS lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:p+PZ\r\na=ice-pwd:D72E+7DA73ZpXav0OX6hqKc4\r\na=fingerprint:sha-256 63:80:36:6C:FF:B6:A3:90:EC:9E:A1:8F:88:55:BD:6B:BF:22:79:6D:7C:39:F5:28:95:30:20:D5:F4:72:CB:15\r\na=setup:actpass\r\na=mid:audio\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=sendrecv\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC/16000\r\na=rtpmap:104 ISAC/32000\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:106 CN/32000\r\na=rtpmap:105 CN/16000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:112 telephone-event/32000\r\na=rtpmap:113 telephone-event/16000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3304842974 cname:RDB2zd7gCXXkfiVe\r\na=ssrc:3304842974 msid:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y 86d1ad61-714c-4f3c-9925-aa05116dd348\r\na=ssrc:3304842974 mslabel:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y\r\na=ssrc:3304842974 label:86d1ad61-714c-4f3c-9925-aa05116dd348\r\nm=video 9 UDP/TLS/RTP/SAVPF 96 98 100 102 127 97 99 101 125\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:p+PZ\r\na=ice-pwd:D72E+7DA73ZpXav0OX6hqKc4\r\na=fingerprint:sha-256 63:80:36:6C:FF:B6:A3:90:EC:9E:A1:8F:88:55:BD:6B:BF:22:79:6D:7C:39:F5:28:95:30:20:D5:F4:72:CB:15\r\na=setup:actpass\r\na=mid:video\r\na=extmap:2 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=sendrecv\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtpmap:98 VP9/90000\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtpmap:100 H264/90000\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=fmtp:100 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:102 red/90000\r\na=rtpmap:127 ulpfec/90000\r\na=rtpmap:97 rtx/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:99 rtx/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:125 rtx/90000\r\na=fmtp:125 apt=102\r\na=ssrc-group:FID 2063430829 1027334379\r\na=ssrc:2063430829 cname:RDB2zd7gCXXkfiVe\r\na=ssrc:2063430829 msid:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y 66b9a8c4-e14d-4b31-ad79-c02819b2c179\r\na=ssrc:2063430829 mslabel:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y\r\na=ssrc:2063430829 label:66b9a8c4-e14d-4b31-ad79-c02819b2c179\r\na=ssrc:1027334379 cname:RDB2zd7gCXXkfiVe\r\na=ssrc:1027334379 msid:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y 66b9a8c4-e14d-4b31-ad79-c02819b2c179\r\na=ssrc:1027334379 mslabel:lm6ZXIoswvVlvDppEx8maepQnVoZ9lIut18Y\r\na=ssrc:1027334379 label:66b9a8c4-e14d-4b31-ad79-c02819b2c179\r\nm=application 9 DTLS/SCTP 5000\r\nc=IN IP4 0.0.0.0\r\na=ice-ufrag:p+PZ\r\na=ice-pwd:D72E+7DA73ZpXav0OX6hqKc4\r\na=fingerprint:sha-256 
63:80:36:6C:FF:B6:A3:90:EC:9E:A1:8F:88:55:BD:6B:BF:22:79:6D:7C:39:F5:28:95:30:20:D5:F4:72:CB:15\r\na=setup:actpass\r\na=mid:data\r\na=sctpmap:5000 webrtc-datachannel 1024\r\n"
| |
| },
| |
| "queued-packets": -1,
| |
| "streams": [
| |
| {
| |
| "id": 1,
| |
| "ready": -1,
| |
| "ssrc": {
| |
| "audio": 3772083770,
| |
| "video": 1648006521,
| |
| "audio-peer": 3304842974,
| |
| "video-peer": 2063430829,
| |
| "video-peer-rtx": 1027334379
| |
| },
| |
| "direction": {
| |
| "audio-send": true,
| |
| "audio-recv": true,
| |
| "video-send": true,
| |
| "video-recv": true
| |
| },
| |
| "codecs": {
| |
| "audio-pt": 111,
| |
| "audio-codec": "opus",
| |
| "video-pt": 96,
| |
| "video-codec": "vp8"
| |
| },
| |
| "rtcp_stats": {
| |
| "audio": {
| |
| "base": 48000,
| |
| "rtt": 118,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 8,
| |
| "jitter-remote": 7,
| |
| "in-link-quality": 100,
| |
| "in-media-link-quality": 100,
| |
| "out-link-quality": 100,
| |
| "out-media-link-quality": 100
| |
| },
| |
| "video": {
| |
| "base": 90000,
| |
| "rtt": 106,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 12,
| |
| "jitter-remote": 34,
| |
| "in-link-quality": 100,
| |
| "in-media-link-quality": 100,
| |
| "out-link-quality": 100,
| |
| "out-media-link-quality": 100
| |
| }
| |
| },
| |
| "components": [
| |
| {
| |
| "id": 1,
| |
| "state": "ready",
| |
| "connected": 1668173348678,
| |
| "local-candidates": [
| |
| "1 1 udp 2013266431 34.210.153.174 39816 typ host"
| |
| ],
| |
| "remote-candidates": [
| |
| "2774440166 1 udp 1686052607 76.97.223.185 34677 typ srflx raddr 10.137.2.17 rport 34677 generation 0 ufrag p+PZ network-id 1 network-cost 50",
| |
| "201398067 1 udp 2122260223 10.137.2.17 34677 typ host generation 0 ufrag p+PZ network-id 1 network-cost 50"
| |
| ],
| |
| "selected-pair": "1 <-> 2774440166",
| |
| "dtls": {
| |
| "fingerprint": "D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38",
| |
| "remote-fingerprint": "63:80:36:6C:FF:B6:A3:90:EC:9E:A1:8F:88:55:BD:6B:BF:22:79:6D:7C:39:F5:28:95:30:20:D5:F4:72:CB:15",
| |
| "remote-fingerprint-hash": "sha-256",
| |
| "dtls-role": "active",
| |
| "dtls-state": "connected",
| |
| "retransmissions": 0,
| |
| "valid": true,
| |
| "ready": true,
| |
| "handshake-started": 1668173348679,
| |
| "connected": 1668173551000
| |
| },
| |
| "in_stats": {
| |
| "audio_packets": 569,
| |
| "audio_bytes": 49444,
| |
| "audio_bytes_lastsec": 4655,
| |
| "do_audio_nacks": false,
| |
| "video_packets": 522,
| |
| "video_bytes": 524820,
| |
| "video_bytes_lastsec": 69439,
| |
| "do_video_nacks": true,
| |
| "video_nacks": 0,
| |
| "data_packets": 3,
| |
| "data_bytes": 2252
| |
| },
| |
| "out_stats": {
| |
| "audio_packets": 569,
| |
| "audio_bytes": 49444,
| |
| "audio_bytes_lastsec": 4655,
| |
| "audio_nacks": 0,
| |
| "video_packets": 522,
| |
| "video_bytes": 524820,
| |
| "video_bytes_lastsec": 69439,
| |
| "video_nacks": 0,
| |
| "data_packets": 2,
| |
| "data_bytes": 1249
| |
| }
| |
| }
| |
| ]
| |
| }
| |
| ]
| |
| }
| |
| </pre>
| |
| # and here's how it should look with 1 person in a video room
| |
| <pre>
| |
| {
| |
| "session_id": 1402256714270032,
| |
| "session_last_activity": 1668108452194,
| |
| "session_transport": "janus.transport.http",
| |
| "handle_id": 6890846667155791,
| |
| "opaque_id": "videoroomtest-gQ0HkOODYlTn",
| |
| "created": 1668098788364,
| |
| "send_thread_created": true,
| |
| "current_time": 1668115288567,
| |
| "plugin": "janus.plugin.videoroom",
| |
| "plugin_specific": {
| |
| "type": "publisher",
| |
| "room": 1234,
| |
| "id": 8025939259403578,
| |
| "private_id": 3814460675,
| |
| "display": "tester",
| |
| "media": {
| |
| "audio": true,
| |
| "audio_codec": "opus",
| |
| "video": true,
| |
| "video_codec": "vp8",
| |
| "data": false
| |
| },
| |
| "bitrate": 128000,
| |
| "audio-level-dBov": 0,
| |
| "talking": false,
| |
| "hangingup": 0,
| |
| "destroyed": 0
| |
| },
| |
| "flags": {
| |
| "got-offer": true,
| |
| "got-answer": true,
| |
| "processing-offer": false,
| |
| "starting": true,
| |
| "ice-restart": false,
| |
| "ready": true,
| |
| "stopped": false,
| |
| "alert": false,
| |
| "trickle": true,
| |
| "all-trickles": true,
| |
| "resend-trickles": false,
| |
| "trickle-synced": false,
| |
| "data-channels": false,
| |
| "has-audio": true,
| |
| "has-video": true,
| |
| "rfc4588-rtx": false,
| |
| "cleaning": false
| |
| },
| |
| "agent-created": 1668105120869,
| |
| "ice-mode": "full",
| |
| "ice-role": "controlled",
| |
| "sdps": {
| |
| "profile": "UDP/TLS/RTP/SAVPF",
| |
| "local": "v=0\r\no=- 4889174595854780834 2 IN IP4 34.210.153.174\r\ns=VideoRoom 1234\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS janus\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111\r\nc=IN IP4 34.210.153.174\r\na=recvonly\r\na=mid:audio\r\na=rtcp-mux\r\na=ice-ufrag:Uzqo\r\na=ice-pwd:8HifUw7pi8Yo7sAjXkXBho\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:111 opus/48000/2\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=candidate:1 1 udp 2013266431 34.210.153.174 52952 typ host\r\na=end-of-candidates\r\nm=video 9 UDP/TLS/RTP/SAVPF 96\r\nc=IN IP4 34.210.153.174\r\na=recvonly\r\na=mid:video\r\na=rtcp-mux\r\na=ice-ufrag:Uzqo\r\na=ice-pwd:8HifUw7pi8Yo7sAjXkXBho\r\na=ice-options:trickle\r\na=fingerprint:sha-256 D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38\r\na=setup:active\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=candidate:1 1 udp 2013266431 34.210.153.174 52952 typ host\r\na=end-of-candidates\r\n",
| |
| "remote": "v=0\r\no=- 4889174595854780834 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE audio video\r\na=msid-semantic: WMS UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111 103 104 9 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:Zvcf\r\na=ice-pwd:+deoKfh3lfJijEJjllJSl+Uy\r\na=fingerprint:sha-256 12:1E:EE:EA:79:6C:10:0B:F1:CF:BA:36:9B:CA:06:2E:DD:9F:27:94:BE:59:D5:F3:41:33:ED:8C:B7:B8:BD:BE\r\na=setup:actpass\r\na=mid:audio\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=sendonly\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC/16000\r\na=rtpmap:104 ISAC/32000\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:106 CN/32000\r\na=rtpmap:105 CN/16000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:112 telephone-event/32000\r\na=rtpmap:113 telephone-event/16000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:1148435629 cname:GyI76iRrqiXyq78e\r\na=ssrc:1148435629 msid:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G b57f5c0e-5053-4075-94af-be03835fbe73\r\na=ssrc:1148435629 mslabel:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G\r\na=ssrc:1148435629 label:b57f5c0e-5053-4075-94af-be03835fbe73\r\nm=video 9 UDP/TLS/RTP/SAVPF 96 98 100 102 127 97 99 101 125\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:Zvcf\r\na=ice-pwd:+deoKfh3lfJijEJjllJSl+Uy\r\na=fingerprint:sha-256 12:1E:EE:EA:79:6C:10:0B:F1:CF:BA:36:9B:CA:06:2E:DD:9F:27:94:BE:59:D5:F3:41:33:ED:8C:B7:B8:BD:BE\r\na=setup:actpass\r\na=mid:video\r\na=extmap:2 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:4 urn:3gpp:video-orientation\r\na=extmap:5 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:6 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=sendonly\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 VP8/90000\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtpmap:98 VP9/90000\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtpmap:100 H264/90000\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=fmtp:100 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f\r\na=rtpmap:102 red/90000\r\na=rtpmap:127 ulpfec/90000\r\na=rtpmap:97 rtx/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:99 rtx/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:125 rtx/90000\r\na=fmtp:125 apt=102\r\na=ssrc-group:FID 3236656850 3421542791\r\na=ssrc:3236656850 cname:GyI76iRrqiXyq78e\r\na=ssrc:3236656850 msid:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G 2bcf83cc-4452-4e59-916b-ef827f43ddb7\r\na=ssrc:3236656850 mslabel:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G\r\na=ssrc:3236656850 label:2bcf83cc-4452-4e59-916b-ef827f43ddb7\r\na=ssrc:3421542791 cname:GyI76iRrqiXyq78e\r\na=ssrc:3421542791 msid:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G 2bcf83cc-4452-4e59-916b-ef827f43ddb7\r\na=ssrc:3421542791 mslabel:UcQkRoy2WPdTsECg49jOztMxnWrHKJX4fQ7G\r\na=ssrc:3421542791 label:2bcf83cc-4452-4e59-916b-ef827f43ddb7\r\n"
| |
| },
| |
| "queued-packets": -1,
| |
| "streams": [
| |
| {
| |
| "id": 1,
| |
| "ready": -1,
| |
| "ssrc": {
| |
| "audio": 2938984141,
| |
| "video": 942089758,
| |
| "audio-peer": 1148435629,
| |
| "video-peer": 3236656850,
| |
| "video-peer-rtx": 3421542791
| |
| },
| |
| "direction": {
| |
| "audio-send": false,
| |
| "audio-recv": true,
| |
| "video-send": false,
| |
| "video-recv": true
| |
| },
| |
| "codecs": {
| |
| "audio-pt": 111,
| |
| "audio-codec": "opus",
| |
| "video-pt": 96,
| |
| "video-codec": "vp8"
| |
| },
| |
| "rtcp_stats": {
| |
| "audio": {
| |
| "base": 48000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 3,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 100,
| |
| "in-media-link-quality": 100,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| },
| |
| "video": {
| |
| "base": 90000,
| |
| "rtt": 0,
| |
| "lost": 0,
| |
| "lost-by-remote": 0,
| |
| "jitter-local": 26,
| |
| "jitter-remote": 0,
| |
| "in-link-quality": 100,
| |
| "in-media-link-quality": 100,
| |
| "out-link-quality": 0,
| |
| "out-media-link-quality": 0
| |
| }
| |
| },
| |
| "components": [
| |
| {
| |
| "id": 1,
| |
| "state": "connected",
| |
| "connected": 1668107173208,
| |
| "local-candidates": [
| |
| "1 1 udp 2013266431 34.210.153.174 52952 typ host"
| |
| ],
| |
| "remote-candidates": [
| |
| "201398067 1 udp 2122260223 10.137.2.17 44959 typ host generation 0 ufrag Zvcf network-id 1 network-cost 50",
| |
| "2774440166 1 udp 1686052607 76.97.223.185 44959 typ srflx raddr 10.137.2.17 rport 44959 generation 0 ufrag Zvcf network-id 1 network-cost 50"
| |
| ],
| |
| "selected-pair": "1 <-> 2774440166",
| |
| "dtls": {
| |
| "fingerprint": "D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38",
| |
| "remote-fingerprint": "12:1E:EE:EA:79:6C:10:0B:F1:CF:BA:36:9B:CA:06:2E:DD:9F:27:94:BE:59:D5:F3:41:33:ED:8C:B7:B8:BD:BE",
| |
| "remote-fingerprint-hash": "sha-256",
| |
| "dtls-role": "active",
| |
| "dtls-state": "connected",
| |
| "retransmissions": 0,
| |
| "valid": true,
| |
| "ready": true,
| |
| "handshake-started": 1668107173210,
| |
| "connected": 1668107359694
| |
| },
| |
| "in_stats": {
| |
| "audio_packets": 396,
| |
| "audio_bytes": 37431,
| |
| "audio_bytes_lastsec": 4744,
| |
| "do_audio_nacks": false,
| |
| "video_packets": 160,
| |
| "video_bytes": 129569,
| |
| "video_bytes_lastsec": 17622,
| |
| "do_video_nacks": true,
| |
| "video_nacks": 0,
| |
| "data_packets": 3,
| |
| "data_bytes": 2250
| |
| },
| |
| "out_stats": {
| |
| "audio_packets": 0,
| |
| "audio_bytes": 0,
| |
| "audio_bytes_lastsec": 0,
| |
| "audio_nacks": 0,
| |
| "video_packets": 0,
| |
| "video_bytes": 0,
| |
| "video_bytes_lastsec": 0,
| |
| "video_nacks": 0,
| |
| "data_packets": 2,
| |
| "data_bytes": 1249
| |
| }
| |
| }
| |
| ]
| |
| }
| |
| ]
| |
| }
| |
| </pre>
| |
| # I added another laptop to the video room, and it appears to work fine--not sure if it's going through the server or just locally, though. Anyway, in this case there were 2x sessions, each with 2x handles. One handle was marked as type=subscriber and the other as type=publisher.
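| |
| ## for reference, JSON dumps like the ones above come out of the Janus Admin API. A minimal sketch of pulling one handle's info with curl, assuming the admin HTTP interface is enabled with the sample-config defaults (admin_port = 7088, admin_base_path = /admin, admin_secret = janusoverlord -- all assumptions from the stock files, adjust to match your install); the session & handle ids come from prior 'list_sessions' / 'list_handles' requests
| |
| <pre>
| |
| # hypothetical query against the echo test session/handle ids shown above
| |
| curl -s -X POST http://127.0.0.1:7088/admin/8969403825798607/5189677910802057 -d '{"janus": "handle_info", "transaction": "abc123", "admin_secret": "janusoverlord"}'
| |
| </pre>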
| |
| | |
| =Tue May 08, 2018=
| |
| # whitelisted mod_security rule = '960024' to fix Forbidden message when Marcin commented on a post on osemain
| |
| | |
| =Mon May 07, 2018=
| |
| # logged time + updated Current Meeting
| |
| | |
| =Fri May 04, 2018=
| |
| # Marcin completed the first draft of the wiki migration test plan last night http://opensourceecology.org/wiki/Wiki_Validation
| |
| # I made some edits & additions to the test plan
| |
| # I sent an email to Marcin asking for clarification on a few of the items of the test plan
| |
| # after Marcin responds, we'll go through the checklist on the staging site. If it passes, then we'll schedule a CHG date & time.
| |
| # In the meantime, I'll continue with the jangouts POC by trying to put the html/js demos vhost behind a quick self-signed https cert
| |
| <pre>
| |
| mkdir /etc/ssl/private/
| |
| sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout /etc/ssl/private/nginx-selfsigned.key -out /etc/ssl/certs/nginx-selfsigned.crt
| |
| vim /etc/nginx/conf.d/jangouts.opensourceecology.org
| |
| </pre>
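| |
| ## the vhost file contents weren't captured above; a minimal sketch of what /etc/nginx/conf.d/jangouts.opensourceecology.org needs in order to serve the demos over the self-signed pair (the root path here is illustrative, not the exact one used)
| |
| <pre>
| |
| server {
| |
|     listen 443 ssl;
| |
|     server_name jangouts.opensourceecology.org;
| |
|     ssl_certificate     /etc/ssl/certs/nginx-selfsigned.crt;
| |
|     ssl_certificate_key /etc/ssl/private/nginx-selfsigned.key;
| |
|     root /var/www/html;    # illustrative -- wherever the janus html/js demos live
| |
|     index index.html;
| |
| }
| |
| </pre>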
| |
| # that eliminated the previous security error, but now I'm getting "Probably a network error, is the gateway down?: [object Object]" again
| |
| # the dev console shows "Using REST API to contact Janus: https://jangouts.opensourceecology.org:8089/janus / Failed to load resource: net::ERR_CONNECTION_TIMED_OUT" so probably I need to change the gateway to use https as well..
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus]# cp janus.transport.http.cfg janus.transport.http.cfg.orig
| |
| [root@ip-172-31-28-115 janus]# vim janus.transport.http.cfg
| |
| [root@ip-172-31-28-115 janus]# diff janus.transport.http.cfg.orig janus.transport.http.cfg
| |
| 22c22
| |
| < https = no ; Whether to enable HTTPS (default=no)
| |
| ---
| |
| > https = yes ; Whether to enable HTTPS (default=no)
| |
| [root@ip-172-31-28-115 janus]#
| |
| </pre>
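| |
| ## note: no port change was needed in that file; the stock janus.transport.http.cfg defines the HTTPS port separately (an assumption from the sample config, I didn't diff that line), which matches the port the demo js was already trying:
| |
| <pre>
| |
| secure_port = 8089
| |
| </pre>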
| |
| # I also had to update the security group to permit 8089 in (it was 8088 before with http). After I did this, the error changed from a delayed timeout to an immediate "janus.js:74 OPTIONS https://jangouts.opensourceecology.org:8089/janus net::ERR_INSECURE_RESPONSE"
| |
| # this is because it's a self-signed cert on a different "server" (different port) than the one we created an exception for. The way around this is to just hit "https://jangouts.opensourceecology.org:8089/janus" in the browser once, approve the certificate exception (click "proceed to ..."), and then try again
| |
| # now I can load all the demos, but nothing appears to actually work. for example, when I click the 'publish' button in the Video Room demo, it goes away after a few seconds
| |
| ## here's what I see in the browser's console
| |
| <pre>
| |
| janus.js:3064 isTrickleEnabled: undefined
| |
| janus.js:2987 isAudioSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| videoroomtest.js:109 Consent dialog should be on now
| |
| janus.js:2987 isAudioSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:3020 isVideoSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:2987 isAudioSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:3020 isVideoSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:2998 isAudioSendRequired: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:3031 isVideoSendRequired: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:2106 getUserMedia constraints Object {audio: true, video: false}
| |
| videoroomtest.js:109 Consent dialog should be off now
| |
| janus.js:1406 streamsDone: MediaStream {id: "r6H7MBqVvdSoPCKa3pvdKK9cevO4aGCEuKDu", active: true, onaddtrack: null, onremovetrack: null, onactive: null…}
| |
| janus.js:1408 -- Audio tracks: [MediaStreamTrack]
| |
| janus.js:1409 -- Video tracks: []
| |
| janus.js:1518 Creating PeerConnection
| |
| janus.js:1519 Object {optional: Array(1)}
| |
| janus.js:1521 RTCPeerConnection {remoteDescription: RTCSessionDescription, signalingState: "stable", iceGatheringState: "new", iceConnectionState: "new", onnegotiationneeded: null…}
| |
| janus.js:1526 Preparing local SDP and gathering candidates (trickle=true)
| |
| janus.js:1577 Adding local stream
| |
| janus.js:1579 Adding local track: MediaStreamTrack {kind: "audio", id: "f4d9b3bf-de5a-4b69-b2bf-f5135f2e7341", label: "Default", enabled: true, muted: false…}
| |
| janus.js:3053 isDataEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| videoroomtest.js:276 ::: Got a local stream :::
| |
| videoroomtest.js:278 MediaStream {id: "r6H7MBqVvdSoPCKa3pvdKK9cevO4aGCEuKDu", active: true, onaddtrack: null, onremovetrack: null, onactive: null…}
| |
| janus.js:2183 Creating offer (iceDone=false)
| |
| janus.js:3009 isAudioRecvEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:3042 isVideoRecvEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| janus.js:2282 Object {offerToReceiveAudio: false, offerToReceiveVideo: false}
| |
| janus.js:3020 isVideoSendEnabled: Object {audioRecv: false, videoRecv: false, audioSend: true, videoSend: true, update: false}
| |
| ↵"}9706 label:f4d9b3bf-de5a-4b69-b2bf-f5135f2e7341, sdp: "v=0
| |
| janus.js:2301 Setting local description
| |
| janus.js:2319 Offer ready
| |
| janus.js:2320 Object {media: Object, simulcast: false, success: function, error: function}
| |
| videoroomtest.js:402 Got publisher SDP!
| |
| ↵"}9706 label:f4d9b3bf-de5a-4b69-b2bf-f5135f2e7341v=0
| |
| janus.js:1132 Sending message to plugin (handle=1762304507750846):
| |
| janus.js:1133 Object {janus: "message", body: Object, transaction: "lG8BwROpEMYy", jsep: Object}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "avyXGgAHSYaW"}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "MS94i2gXf9wU"}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "AnnWqBAjyOoe"}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "SWyGgmyBz5ze"}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "t3tlOjUVQ6Yc"}
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "NIY0ynxdRiwm"}
| |
| janus.js:1534 End of candidates.
| |
| janus.js:1229 Sending trickle candidate (handle=1762304507750846):
| |
| janus.js:1230 Object {janus: "trickle", candidate: Object, transaction: "4TjYcHlKlWs6"}
| |
| janus.js:1175 Message sent!
| |
| janus.js:1176 Object {janus: "ack", session_id: 7298240886377737, transaction: "lG8BwROpEMYy"}
| |
| janus.js:610 Got a plugin event on session 7298240886377737
| |
| janus.js:611 Object {janus: "event", session_id: 7298240886377737, transaction: "lG8BwROpEMYy", sender: 1762304507750846, plugindata: Object…}
| |
| janus.js:622 -- Event is coming from 1762304507750846 (janus.plugin.videoroom)
| |
| janus.js:624 Object {videoroom: "event", room: 1234, configured: "ok", audio_codec: "opus"}
| |
| janus.js:632 Handling SDP as well...
| |
| ↵"}end-of-candidates1864 2 IN IP4 172.31.28.11…2.31.28.115 40693 typ host
| |
| janus.js:637 Notifying application...
| |
| videoroomtest.js:149 ::: Got a message (publisher) :::
| |
| videoroomtest.js:150 Object {videoroom: "event", room: 1234, configured: "ok", audio_codec: "opus"}
| |
| videoroomtest.js:152 Event: event
| |
| videoroomtest.js:251 Handling SDP as well...
| |
| ↵"}end-of-candidates1864 2 IN IP4 172.31.28.11…2.31.28.115 40693 typ host
| |
| janus.js:2144 Remote description accepted!
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "avyXGgAHSYaW"}
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "MS94i2gXf9wU"}
| |
| janus.js:410 Long poll...
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "AnnWqBAjyOoe"}
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "SWyGgmyBz5ze"}
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "t3tlOjUVQ6Yc"}
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "NIY0ynxdRiwm"}
| |
| janus.js:1242 Candidate sent!
| |
| janus.js:1243 Object {janus: "ack", session_id: 7298240886377737, transaction: "4TjYcHlKlWs6"}
| |
| janus.js:535 Got a hangup event on session 7298240886377737
| |
| janus.js:536 Object {janus: "hangup", session_id: 7298240886377737, sender: 1762304507750846, reason: "ICE failed"}
| |
| videoroomtest.js:131 Janus says our WebRTC PeerConnection is down now
| |
| janus.js:2708 Cleaning WebRTC stuff
| |
| janus.js:2753 Stopping local stream tracks
| |
| janus.js:2757 MediaStreamTrack {kind: "audio", id: "f4d9b3bf-de5a-4b69-b2bf-f5135f2e7341", label: "Default", enabled: true, muted: false…}
| |
| videoroomtest.js:324 ::: Got a cleanup notification: we are unpublished now :::
| |
| janus.js:410 Long poll...
| |
| janus.js:455 Got a keepalive on session 7298240886377737
| |
| janus.js:410 Long poll...
| |
| </pre>
| |
| ## and here's the output in the terminal where janus is running (not as a daemon)
| |
| <pre>
| |
| [1762304507750846] Creating ICE agent (ICE Full mode, controlled)
| |
| [ERR] [sdp-utils.c:janus_sdp_get_codec_rtpmap:718] Unsupported codec 'none'
| |
| [WARN] [1762304507750846] ICE failed for component 1 in stream 1, but let's give it some time... (trickle received, answer received, alert not set)
| |
| [ERR] [ice.c:janus_ice_check_failed:1428] [1762304507750846] ICE failed for component 1 in stream 1...
| |
| [janus.plugin.videoroom-0x7fbf44003e30] No WebRTC media anymore; 0x7fbf44004aa0 0x7fbf44005780
| |
| [1762304507750846] WebRTC resources freed; 0x7fbf44004aa0 0x7fbf44004980
| |
| </pre>
| |
| # I tried the echo test demo from another machine that actually has a camera. It showed me, but not the echo. Here's what the janus output was on the server
| |
| <pre>
| |
| Creating new session: 607187205577013; 0x7fbf44003400
| |
| Creating new handle in session 607187205577013: 5995577680857670; 0x7fbf44003400 0x7fbf44007e30
| |
| [5995577680857670] Creating ICE agent (ICE Full mode, controlled)
| |
| [WARN] [5995577680857670] Skipping disabled/unsupported media line...
| |
| [WARN] [5995577680857670] Skipping disabled/unsupported media line...
| |
| [WARN] [5995577680857670] ICE failed for component 1 in stream 1, but let's give it some time... (trickle received, answer received, alert not set)
| |
| [ERR] [ice.c:janus_ice_check_failed:1428] [5995577680857670] ICE failed for component 1 in stream 1...
| |
| [janus.plugin.echotest-0x7fbf44007d00] No WebRTC media anymore
| |
| [5995577680857670] WebRTC resources freed; 0x7fbf44007e30 0x7fbf44003400
| |
| Destroying session 607187205577013; 0x7fbf44003400
| |
| Detaching handle from JANUS EchoTest plugin; 0x7fbf44007e30 0x7fbf44007d00 0x7fbf44007e30 0x7fbf4400fbb0
| |
| [5995577680857670] WebRTC resources freed; 0x7fbf44007e30 0x7fbf44003400
| |
| [5995577680857670] Handle and related resources freed; 0x7fbf44007e30 0x7fbf44003400
| |
| </pre>
| |
| # and here's the server-side output from running the recordplaytest demo
| |
| ## this came when just loading the page and then recording, which appeared to work fine
| |
| <pre>
| |
| Creating new session: 5857587399713942; 0x7fbf44003400
| |
| Creating new handle in session 5857587399713942: 4667944641759999; 0x7fbf44003400 0x7fbf44003a00
| |
| [4667944641759999] Creating ICE agent (ICE Full mode, controlled)
| |
| [WARN] Audio codec: opus
| |
| [WARN] Video codec: vp8
| |
| [WARN] [4667944641759999] ICE failed for component 1 in stream 1, but let's give it some time... (trickle received, answer received, alert not set)
| |
| [ERR] [ice.c:janus_ice_check_failed:1428] [4667944641759999] ICE failed for component 1 in stream 1...
| |
| [janus.plugin.recordplay-0x7fbf4400e090] No WebRTC media anymore
| |
| File is 8 bytes: rec-6957767953715790-audio.mjr
| |
| Closed audio recording rec-6957767953715790-audio.mjr
| |
| File is 8 bytes: rec-6957767953715790-video.mjr
| |
| Closed video recording rec-6957767953715790-video.mjr
| |
| [4667944641759999] WebRTC resources freed; 0x7fbf44003a00 0x7fbf44003400
| |
| </pre>
| |
| ## and this came when finding the recording in the list & attempting to play it back. This part didn't work at all, and the browser said "Error opening recording files"
| |
| <pre>
| |
| [WARN] Error opening audio recording, trying to go on anyway
| |
| [WARN] Error opening video recording, trying to go on anyway
| |
| </pre>
| |
| # and when I attempted to load the videoroomtest from a computer that actually has a camera, I would see myself for a bit, but then it would disappear. I could click "Publish", and it would do the same: show me, then disappear. Here's what the server showed as this happened:
| |
| <pre>
| |
| Creating new session: 18614750988061; 0x7fbf440028f0
| |
| Creating new handle in session 18614750988061: 3771387103782624; 0x7fbf440028f0 0x7fbf44003a00
| |
| [WARN] [3771387103782624] No stream, queueing this trickle as it got here before the SDP...
| |
| [3771387103782624] Creating ICE agent (ICE Full mode, controlled)
| |
| Timeout expired for session 6914064699623145...
| |
| Detaching handle from JANUS VideoCall plugin; 0x7fbf4400e0c0 0x7fbf44002100 0x7fbf4400e0c0 0x7fbf4400bc30
| |
| [janus.plugin.videocall-0x7fbf44002100] No WebRTC media anymore
| |
| [7781385739946092] WebRTC resources freed; 0x7fbf4400e0c0 0x7fbf44004390
| |
| [7781385739946092] Handle and related resources freed; 0x7fbf4400e0c0 0x7fbf44004390
| |
| Destroying session 6914064699623145; 0x7fbf44004390
| |
| [WARN] [3771387103782624] ICE failed for component 1 in stream 1, but let's give it some time... (trickle received, answer received, alert not set)
| |
| [ERR] [ice.c:janus_ice_check_failed:1428] [3771387103782624] ICE failed for component 1 in stream 1...
| |
| [janus.plugin.videoroom-0x7fbf4400e090] No WebRTC media anymore; 0x7fbf44003a00 0x7fbf44005780
| |
| [3771387103782624] WebRTC resources freed; 0x7fbf44003a00 0x7fbf440028f0
| |
| </pre>
| |
| # I did some reading about ICE and debugging janus
| |
| ## https://en.wikipedia.org/wiki/Interactive_Connectivity_Establishment
| |
| ## http://www.meetecho.com/blog/understanding-the-janus-admin-api/
| |
| # that pointed me to some great resources for debugging webrtc on the browser's end
| |
| ## in chrome, use chrome://webrtc-internals
| |
| ## in firefox, use about:webrtc
| |
| ## in either, this is a good site provided by janus (who apparently provides streaming services for the IETF) for testing the client https://selftest.conf.meetecho.com/test/
| |
| # after the above research, I bet the issue is that I haven't opened the necessary UDP ports to our janus server. I've only touched TCP stuff coming in, and these ICE/STUN/TURN components will likely need UDP opened up
| |
| # also note that, even though we're using a cloud provider here, I am explicitly using the public ip address for our server, so only the client should be traversing NAT here
| |
| <pre>
| |
| user@personal:~$ dig jangouts.opensourceecology.org
| |
| | |
| ; <<>> DiG 9.9.5-9+deb8u15-Debian <<>> jangouts.opensourceecology.org
| |
| ;; global options: +cmd
| |
| ;; Got answer:
| |
| ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 43410
| |
| ;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1
| |
| | |
| ;; OPT PSEUDOSECTION:
| |
| ; EDNS: version: 0, flags:; udp: 4096
| |
| ;; QUESTION SECTION:
| |
| ;jangouts.opensourceecology.org. IN A
| |
| | |
| ;; ANSWER SECTION:
| |
| jangouts.opensourceecology.org. 300 IN A 34.210.153.174
| |
| | |
| ;; Query time: 2229 msec
| |
| ;; SERVER: 10.137.5.1#53(10.137.5.1)
| |
| ;; WHEN: Fri May 04 18:07:54 EDT 2018
| |
| ;; MSG SIZE rcvd: 75
| |
| | |
| user@personal:~$
| |
| </pre>
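| |
| # for reference, if the server side *did* need to traverse NAT, janus can be pointed at a STUN server either with the '-S' flag (see the --help output below) or in the [nat] section of janus.cfg; a minimal sketch (stun.l.google.com is just an illustrative public STUN server, not something we use)
| |
| <pre>
| |
| ; in janus.cfg, [nat] section
| |
| stun_server = stun.l.google.com
| |
| stun_port = 19302
| |
| </pre>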
| |
| # it appears that janus is running on two udp ports = 5002 & 5004
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# ss -planu | grep -i janus
| |
| UNCONN 0 0 *:5002 *:* users:(("janus",pid=6070,fd=5))
| |
| UNCONN 0 0 *:5004 *:* users:(("janus",pid=6070,fd=6))
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| ## that didn't help
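| |
| ### my guess as to why: those two udp ports look like the stock Streaming plugin's sample RTP ports, not ICE ports; each WebRTC PeerConnection binds its own ephemeral UDP port, so opening only 5002 & 5004 wouldn't cover them. Per the --help output below, the ports janus uses for RTP/RTCP can at least be pinned to a known range (illustrative range, untested here)
| |
| <pre>
| |
| /opt/janus/bin/janus --rtp-port-range=20000-40000
| |
| </pre>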
| |
| # I also spent some time researching getting qubes to forward my camera to a new 'conference' appvm so that I can do this testing better. I got it working in chromium, but not firefox
| |
| ## it appears to not be possible to actually enumerate all microphone & camera devices in firefox directly, but this page helps with that https://www.webrtc-experiment.com/demos/MediaStreamTrack.getSources.html
| |
| ## in chromium, you can use the above link or go to chrome://settings > advanced > content settings. There you can see a drop-down showing both the available microphones & cameras.
| |
| # I got it working, so now I can fully test on my laptop running QubesOS
| |
| # I found a research paper analyzing the performance of Janus with 10 publishers + 90 subscribers in the VideoRoom plugin, which is very near to what we want to accomplish at OSE https://www.researchgate.net/publication/300727546_Performance_analysis_of_the_Janus_WebRTC_gateway
| |
| # I also stumbled on 2x more janus/jitsi alternatives:
| |
| ## kurento https://doc-kurento.readthedocs.io/en/stable/
| |
| ### they provide a way to install via AWS CloudFormation or on Ubuntu only https://doc-kurento.readthedocs.io/en/stable/user/installation.html#local-installation
| |
| ### this project has excellent documentation, and they stated that all udp ports (0-65535) must be open to run a STUN server
| |
| ## licode http://lynckia.com/licode/
| |
| ### this can be installed from a docker image, which may be less (or more?) of a headache http://licode.readthedocs.io/en/stable/docker/
| |
| ### otherwise, they only support installs on ubuntu http://licode.readthedocs.io/en/stable/from_source/
| |
| # I still think that Jangouts is probably our best solution afaict
| |
| # I enabled all UDP ports coming in on the security group; this helped, but I'm still having issues
| |
| <pre>
| |
| Destroying session 33940990778352; 0x7f2f5802e800
| |
| Detaching handle from JANUS Record&Play plugin; 0x7f2f58013a70 0x7f2f58014030 0x7f2f58013a70 0x7f2f58014060
| |
| [ERR] [ice.c:janus_plugin_session_is_alive:401] Invalid plugin session (0x7f2f58014030)
| |
| [807900953279518] WebRTC resources freed; 0x7f2f58013a70 0x7f2f5802e800
| |
| [807900953279518] Handle and related resources freed; 0x7f2f58013a70 0x7f2f5802e800
| |
| Creating new session: 2711508791369859; 0x7f2f5802e800
| |
| Creating new handle in session 2711508791369859: 492301824865738; 0x7f2f5802e800 0x7f2f5802d880
| |
| [492301824865738] Creating ICE agent (ICE Full mode, controlled)
| |
| [WARN] [492301824865738] ICE failed for component 1 in stream 1, but let's give it some time... (trickle received, answer received, alert not set)
| |
| [ERR] [ice.c:janus_ice_check_failed:1428] [492301824865738] ICE failed for component 1 in stream 1...
| |
| [janus.plugin.videoroom-0x7f2f58014600] No WebRTC media anymore; 0x7f2f5802d880 0x7f2f58012f80
| |
| [492301824865738] WebRTC resources freed; 0x7f2f5802d880 0x7f2f5802e800
| |
| </pre>
| |
| # to get better debugging info, I should enable the Janus Admin API per this guide http://www.meetecho.com/blog/understanding-the-janus-admin-api/
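| |
| ## a rough sketch of what that involves, based on the sample configs (key names/defaults are assumptions from the stock files, not yet tested here): enable the admin interface in janus.transport.http.cfg, set a secret in janus.cfg, then query port 7088
| |
| <pre>
| |
| # janus.transport.http.cfg:  admin_http = yes    (admin_port defaults to 7088)
| |
| # janus.cfg:                 admin_secret = janusoverlord
| |
| curl -s -X POST http://127.0.0.1:7088/admin -d '{"janus": "list_sessions", "transaction": "abc123", "admin_secret": "janusoverlord"}'
| |
| </pre>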
| |
| | |
| =Thr May 03, 2018=
| |
| # began investigating a POC for jangouts, which uses the janus video gateway and the videoroom plugin as a self-hosted google-hangouts-like alternative https://janus.conf.meetecho.com/docs/README.html
| |
| # first configure attempt failed
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| checking for JANUS... no
| |
| configure: error: Package requirements (
| |
| glib-2.0 >= 2.34
| |
| nice
| |
| jansson >= 2.5
| |
| libssl >= 1.0.1
| |
| libcrypto
| |
| ) were not met:
| |
| | |
| No package 'glib-2.0' found
| |
| No package 'nice' found
| |
| No package 'jansson' found
| |
| | |
| Consider adjusting the PKG_CONFIG_PATH environment variable if you
| |
| installed software in a non-standard prefix.
| |
| | |
| Alternatively, you may set the environment variables JANUS_CFLAGS
| |
| and JANUS_LIBS to avoid the need to call pkg-config.
| |
| See the pkg-config man page for more details.
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # fixed the above issues by also installing some additional packages. some of them required enabling all repos explicitly
| |
| <pre>
| |
| yum install -y glib2-devel
| |
| yum --enablerepo=* -y install libnice-devel
| |
| </pre>
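| |
| ## note: the configure error also named jansson; on a fresh box, something like the following should satisfy all of those pkg-config checks in one shot (package names are my mapping of the README's dependency list onto CentOS repos, so treat this as a sketch)
| |
| <pre>
| |
| yum --enablerepo=* -y install glib2-devel jansson-devel libnice-devel openssl-devel
| |
| </pre>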
| |
| # another error
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| configure: error: libsrtp and libsrtp2 not found. See README.md for installation instructions
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # and here's the attempted fix for the libsrtp requirement, but it didn't help
| |
| <pre>
| |
| yum --enablerepo=* -y install libsrtp-devel
| |
| </pre>
| |
| # it requires a newer version (>=1.5.0) than what is in the repo (1.4.4)
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| checking for SRTP15X... no
| |
| configure: error: Package requirements (
| |
| libsrtp >= 1.5.0
| |
| ) were not met:
| |
| | |
| Requested 'libsrtp >= 1.5.0' but version of libsrtp is 1.4.4
| |
| You may find new versions of libsrtp at http://srtp.sourceforge.net
| |
| | |
| Consider adjusting the PKG_CONFIG_PATH environment variable if you
| |
| installed software in a non-standard prefix.
| |
| | |
| Alternatively, you may set the environment variables SRTP15X_CFLAGS
| |
| and SRTP15X_LIBS to avoid the need to call pkg-config.
| |
| See the pkg-config man page for more details.
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # tried a manual install
| |
| <pre>
| |
| # remove libsrtp from the repos as it's too old
| |
| yum remove -y libsrtp
| |
| | |
| # attempt to install from source
| |
| mkdir -p $HOME/src
| |
| pushd $HOME/src
| |
| wget https://github.com/cisco/libsrtp/archive/v1.5.4.tar.gz
| |
| tar xfv v1.5.4.tar.gz
| |
| cd libsrtp-1.5.4
| |
| ./configure --prefix=/usr --enable-openssl
| |
| make shared_library && sudo make install
| |
| </pre>
| |
| # that still failed; the install itself succeeded, but configure's pkg-config check still couldn't find libsrtp
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| checking for SRTP15X... no
| |
| configure: error: Package requirements (
| |
| libsrtp >= 1.5.0
| |
| ) were not met:
| |
| | |
| No package 'libsrtp' found
| |
| | |
| Consider adjusting the PKG_CONFIG_PATH environment variable if you
| |
| installed software in a non-standard prefix.
| |
| | |
| Alternatively, you may set the environment variables SRTP15X_CFLAGS
| |
| and SRTP15X_LIBS to avoid the need to call pkg-config.
| |
| See the pkg-config man page for more details.
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus --libdir=/usr/lib/
| |
| </pre>
| |
| # so I tried version 2.0, first uninstalling 1.5.4
| |
| <pre>
| |
| [root@ip-172-31-28-115 libsrtp-1.5.4]# make uninstall
| |
| rm -f /usr/include/srtp/*.h
| |
| rm -f /usr/lib/libsrtp.*
| |
| rmdir /usr/include/srtp
| |
| if [ "libsrtp.pc" != "" ]; then \
| |
| rm -f /usr/lib/pkgconfig/libsrtp.pc; \
| |
| fi
| |
| [root@ip-172-31-28-115 libsrtp-1.5.4]#
| |
| </pre>
| |
| <pre>
| |
| mkdir -p $HOME/src
| |
| pushd $HOME/src
| |
| wget https://github.com/cisco/libsrtp/archive/v2.0.0.tar.gz
| |
| tar xfv v2.0.0.tar.gz
| |
| cd libsrtp-2.0.0
| |
| ./configure --prefix=/usr --enable-openssl
| |
| make shared_library && sudo make install
| |
| </pre>
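| |
| ## a quick sanity check (my own, not from the README) that the new library is visible before re-running configure: libsrtp 2.x ships a libsrtp2.pc, so pkg-config should now resolve it
| |
| <pre>
| |
| pkg-config --modversion libsrtp2
| |
| # expected output: 2.0.0
| |
| </pre>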
| |
| # that worked. next issue is lua-libs
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| checking for dlopen in -ldl... yes
| |
| checking for srtp_init in -lsrtp2... yes
| |
| checking srtp2/srtp.h usability... yes
| |
| checking srtp2/srtp.h presence... yes
| |
| checking for srtp2/srtp.h... yes
| |
| checking for srtp_crypto_policy_set_aes_gcm_256_16_auth in -lsrtp2... yes
| |
| checking for usrsctp_finish in -lusrsctp... no
| |
| checking for LIBCURL... yes
| |
| checking for doxygen... no
| |
| checking for dot... no
| |
| checking for gengetopt... yes
| |
| checking for TRANSPORTS... yes
| |
| checking for MHD... no
| |
| checking for lws_create_vhost in -lwebsockets... no
| |
| checking for amqp_error_string2 in -lrabbitmq... no
| |
| checking for MQTTAsync_create in -lpaho-mqtt3a... no
| |
| checking for PLUGINS... yes
| |
| checking for SOFIA... no
| |
| checking for LIBRE... no
| |
| checking for LIBRE... no
| |
| checking for OPUS... no
| |
| checking for OGG... no
| |
| checking for LUA... no
| |
| checking for LUA... no
| |
| configure: error: lua-libs not found. See README.md for installation instructions or use --disable-plugin-lua
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # fixed by installing from all repos again
| |
| <pre>
| |
| yum --enablerepo=* -y install lua-devel
| |
| </pre>
| |
| # and that worked!
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| config.status: executing libtool commands
| |
| | |
| libsrtp version: 2.x
| |
| SSL/crypto library: OpenSSL
| |
| DTLS set-timeout: not available
| |
| DataChannels support: no
| |
| Recordings post-processor: no
| |
| TURN REST API client: yes
| |
| Doxygen documentation: no
| |
| Transports:
| |
| REST (HTTP/HTTPS): no
| |
| WebSockets: no
| |
| RabbitMQ: no
| |
| MQTT: no
| |
| Unix Sockets: yes
| |
| Plugins:
| |
| Echo Test: yes
| |
| Streaming: yes
| |
| Video Call: yes
| |
| SIP Gateway (Sofia): no
| |
| SIP Gateway (libre): no
| |
| NoSIP (RTP Bridge): yes
| |
| Audio Bridge: no
| |
| Video Room: yes
| |
| Voice Mail: no
| |
| Record&Play: yes
| |
| Text Room: yes
| |
| Lua Interpreter: yes
| |
| Event handlers:
| |
| Sample event handler: yes
| |
| RabbitMQ event handler:no
| |
| JavaScript modules: no
| |
| | |
| If this configuration is ok for you, do a 'make' to start building Janus. A 'make install' will install Janus and its plugins to the specified prefix. Finally, a 'make configs' will install some sample configuration files too (something you'll only want to do the first time, though).
| |
| | |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # both make & make install worked too!
| |
| <pre>
| |
| libtool: finish: PATH="/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/sbin" ldconfig -n /opt/janus/lib/janus/plugins
| |
| ----------------------------------------------------------------------
| |
| Libraries have been installed in:
| |
| /opt/janus/lib/janus/plugins
| |
| | |
| If you ever happen to want to link against installed libraries
| |
| in a given directory, LIBDIR, you must either use libtool, and
| |
| specify the full pathname of the library, or use the `-LLIBDIR'
| |
| flag during linking and do at least one of the following:
| |
| - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
| |
| during execution
| |
| - add LIBDIR to the `LD_RUN_PATH' environment variable
| |
| during linking
| |
| - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
| |
| - have your system administrator add LIBDIR to `/etc/ld.so.conf'
| |
| | |
| See any operating system documentation about shared libraries for
| |
| more information, such as the ld(1) and ld.so(8) manual pages.
| |
| ----------------------------------------------------------------------
| |
| /bin/mkdir -p '/opt/janus/include/janus/plugins'
| |
| /bin/install -c -m 644 plugins/plugin.h '/opt/janus/include/janus/plugins'
| |
| /bin/mkdir -p '/opt/janus/share/janus/recordings'
| |
| /bin/install -c -m 644 plugins/recordings/1234.nfo plugins/recordings/rec-sample-audio.mjr plugins/recordings/rec-sample-video.mjr '/opt/janus/share/janus/recordings'
| |
| /bin/mkdir -p '/opt/janus/share/janus/streams'
| |
| /bin/install -c -m 644 plugins/streams/music.mulaw plugins/streams/radio.alaw plugins/streams/test_gstreamer.sh plugins/streams/test_gstreamer_1.sh '/opt/janus/share/janus/streams'
| |
| /bin/mkdir -p '/opt/janus/lib/janus/transports'
| |
| /bin/sh ./libtool --mode=install /bin/install -c transports/libjanus_pfunix.la '/opt/janus/lib/janus/transports'
| |
| libtool: install: /bin/install -c transports/.libs/libjanus_pfunix.so.0.0.0 /opt/janus/lib/janus/transports/libjanus_pfunix.so.0.0.0
| |
| libtool: install: (cd /opt/janus/lib/janus/transports && { ln -s -f libjanus_pfunix.so.0.0.0 libjanus_pfunix.so.0 || { rm -f libjanus_pfunix.so.0 && ln -s libjanus_pfunix.so.0.0.0 libjanus_pfunix.so.0; }; })
| |
| libtool: install: (cd /opt/janus/lib/janus/transports && { ln -s -f libjanus_pfunix.so.0.0.0 libjanus_pfunix.so || { rm -f libjanus_pfunix.so && ln -s libjanus_pfunix.so.0.0.0 libjanus_pfunix.so; }; })
| |
| libtool: install: /bin/install -c transports/.libs/libjanus_pfunix.lai /opt/janus/lib/janus/transports/libjanus_pfunix.la
| |
| libtool: finish: PATH="/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/sbin" ldconfig -n /opt/janus/lib/janus/transports
| |
| ----------------------------------------------------------------------
| |
| Libraries have been installed in:
| |
| /opt/janus/lib/janus/transports
| |
| | |
| If you ever happen to want to link against installed libraries
| |
| in a given directory, LIBDIR, you must either use libtool, and
| |
| specify the full pathname of the library, or use the `-LLIBDIR'
| |
| flag during linking and do at least one of the following:
| |
| - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
| |
| during execution
| |
| - add LIBDIR to the `LD_RUN_PATH' environment variable
| |
| during linking
| |
| - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
| |
| - have your system administrator add LIBDIR to `/etc/ld.so.conf'
| |
| | |
| See any operating system documentation about shared libraries for
| |
| more information, such as the ld(1) and ld.so(8) manual pages.
| |
| ----------------------------------------------------------------------
| |
| /bin/mkdir -p '/opt/janus/include/janus/transports'
| |
| /bin/install -c -m 644 transports/transport.h '/opt/janus/include/janus/transports'
| |
| make[3]: Leaving directory `/root/sandbox/janus-gateway'
| |
| make[2]: Leaving directory `/root/sandbox/janus-gateway'
| |
| make[1]: Leaving directory `/root/sandbox/janus-gateway'
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # but then it failed to start, citing libsrtp issues again
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# /opt/janus/bin/janus
| |
| /opt/janus/bin/janus: error while loading shared libraries: libsrtp2.so.1: cannot open shared object file: No such file or directory
| |
| [root@ip-172-31-28-115 janus-gateway]# /opt/janus/bin/janus --help
| |
| /opt/janus/bin/janus: error while loading shared libraries: libsrtp2.so.1: cannot open shared object file: No such file or directory
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # I tried again using the '--libdir=/usr/lib64' or '--libdir=/usr/lib' flag during configure, per the README's recommendations, but that wasn't particularly helpful
| |
| <pre>
| |
| make uninstall
| |
| make
| |
| make install
| |
| make configs
| |
| </pre>
| |
| # indeed, it's at "/usr/lib"
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# find /usr -name libsrtp2.so.1
| |
| /usr/lib/libsrtp2.so.1
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # I got it to run by setting the LD_LIBRARY_PATH per https://groups.google.com/forum/#!topic/meetecho-janus/fznCh3UYSCg
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# LD_LIBRARY_PATH=/usr/lib && /opt/janus/bin/janus --help
| |
| Janus commit: d8da250294cbdc193252ce059ef281ba0e2ff5bd
| |
| Compiled on: Thu May 3 22:03:54 UTC 2018
| |
| | |
| janus 0.4.0
| |
| | |
| Usage: janus [OPTIONS]...
| |
| | |
| -h, --help Print help and exit
| |
| -V, --version Print version and exit
| |
| -b, --daemon Launch Janus in background as a daemon
| |
| (default=off)
| |
| -p, --pid-file=path Open the specified PID file when starting Janus
| |
| (default=none)
| |
| -N, --disable-stdout Disable stdout based logging (default=off)
| |
| -L, --log-file=path Log to the specified file (default=stdout only)
| |
| -i, --interface=ipaddress Interface to use (will be the public IP)
| |
| -P, --plugins-folder=path Plugins folder (default=./plugins)
| |
| -C, --config=filename Configuration file to use
| |
| -F, --configs-folder=path Configuration files folder (default=./conf)
| |
| -c, --cert-pem=filename DTLS certificate
| |
| -k, --cert-key=filename DTLS certificate key
| |
| -K, --cert-pwd=text DTLS certificate key passphrase (if needed)
| |
| -S, --stun-server=ip:port STUN server(:port) to use, if needed (e.g.,
| |
| gateway behind NAT, default=none)
| |
| -1, --nat-1-1=ip Public IP to put in all host candidates,
| |
| assuming a 1:1 NAT is in place (e.g., Amazon
| |
| EC2 instances, default=none)
| |
| -E, --ice-enforce-list=list Comma-separated list of the only interfaces to
| |
| use for ICE gathering; partial strings are
| |
| supported (e.g., eth0 or eno1,wlan0,
| |
| default=none)
| |
| -X, --ice-ignore-list=list Comma-separated list of interfaces or IP
| |
| addresses to ignore for ICE gathering;
| |
| partial strings are supported (e.g.,
| |
| vmnet8,192.168.0.1,10.0.0.1 or
| |
| vmnet,192.168., default=vmnet)
| |
| -6, --ipv6-candidates Whether to enable IPv6 candidates or not
| |
| (experimental) (default=off)
| |
| -l, --libnice-debug Whether to enable libnice debugging or not
| |
| (default=off)
| |
| -f, --full-trickle Do full-trickle instead of half-trickle
| |
| (default=off)
| |
| -I, --ice-lite Whether to enable the ICE Lite mode or not
| |
| (default=off)
| |
| -T, --ice-tcp Whether to enable ICE-TCP or not (warning: only
| |
| works with ICE Lite) (default=off)
| |
| -R, --rfc-4588 Whether to enable RFC4588 retransmissions
| |
| support or not (default=off)
| |
| -q, --max-nack-queue=number Maximum size of the NACK queue (in ms) per user
| |
| for retransmissions
| |
| -t, --no-media-timer=number Time (in s) that should pass with no media
| |
| (audio or video) being received before Janus
| |
| notifies you about this
| |
| -r, --rtp-port-range=min-max Port range to use for RTP/RTCP
| |
| -n, --server-name=name Public name of this Janus instance
| |
| (default=MyJanusInstance)
| |
| -s, --session-timeout=number Session timeout value, in seconds (default=60)
| |
| -m, --reclaim-session-timeout=number
| |
| Reclaim session timeout value, in seconds
| |
| (default=0)
| |
| -d, --debug-level=1-7 Debug/logging level (0=disable debugging,
| |
| 7=maximum debug level; default=4)
| |
| -D, --debug-timestamps Enable debug/logging timestamps (default=off)
| |
| -o, --disable-colors Disable color in the logging (default=off)
| |
| -a, --apisecret=randomstring API secret all requests need to pass in order
| |
| to be accepted by Janus (useful when wrapping
| |
| Janus API requests in a server, none by
| |
| default)
| |
| -A, --token-auth Enable token-based authentication for all
| |
| requests (default=off)
| |
| --token-auth-secret=randomstring
| |
| Secret to verify HMAC-signed tokens with, to be
| |
| used with -A
| |
| -e, --event-handlers Enable event handlers (default=off)
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
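| ## note: the `VAR=value && command` form sets the variable for the whole shell session, and it only reaches the child process if the variable happens to already be exported; the usual one-shot form is a prefix assignment, e.g.
|
| <pre>
|
| # set LD_LIBRARY_PATH for this single invocation only
|
| LD_LIBRARY_PATH=/usr/lib /opt/janus/bin/janus --help
|
| </pre>
|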
| # attempts to start failed
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# LD_LIBRARY_PATH=/usr/lib && /opt/janus/bin/janus
| |
| Janus commit: d8da250294cbdc193252ce059ef281ba0e2ff5bd
| |
| Compiled on: Thu May 3 22:03:54 UTC 2018
| |
| | |
| ---------------------------------------------------
| |
| Starting Meetecho Janus (WebRTC Gateway) v0.4.0
| |
| ---------------------------------------------------
| |
| | |
| Checking command line arguments...
| |
| Debug/log level is 4
| |
| Debug/log timestamps are disabled
| |
| Debug/log colors are enabled
| |
| Adding 'vmnet' to the ICE ignore list...
| |
| Using 172.31.28.115 as local IP...
| |
| [WARN] Token based authentication disabled
| |
| Initializing recorder code
| |
| Initializing ICE stuff (Full mode, ICE-TCP candidates disabled, half-trickle, IPv6 support disabled)
| |
| TURN REST API backend: (disabled)
| |
| [WARN] Janus is deployed on a private address (172.31.28.115) but you didn't specify any STUN server! Expect trouble if this is supposed to work over the internet and not just in a LAN...
| |
| Crypto: OpenSSL pre-1.1.0
| |
| [WARN] The libsrtp installation does not support AES-GCM profiles
| |
| Fingerprint of our certificate: D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38
| |
| [WARN] Data Channels support not compiled
| |
| [WARN] Event handlers support disabled
| |
| Plugins folder: /opt/janus/lib/janus/plugins
| |
| Transport plugins folder: /opt/janus/lib/janus/transports
| |
| [FATAL] [janus.c:main:4209] No Janus API transport is available... enable at least one and restart Janus
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # I added an ldconfig config file
| |
| <pre># add lib dir
| |
| cat << EOF > /etc/ld.so.conf.d/janus.conf
| |
| /usr/lib
| |
| /opt/janus/lib/janus/plugins
| |
| EOF
| |
| ldconfig
| |
| </pre>
| |
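| ## a quick way to verify the runtime linker's cache picked up the new dir:
|
| <pre>
|
| # the libsrtp lib should now show up in the loader cache
|
| ldconfig -p | grep libsrtp
|
| </pre>
|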
| # and now I got a different error
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# /opt/janus/bin/janus
| |
| Janus commit: d8da250294cbdc193252ce059ef281ba0e2ff5bd
| |
| Compiled on: Thu May 3 22:14:30 UTC 2018
| |
| | |
| ---------------------------------------------------
| |
| Starting Meetecho Janus (WebRTC Gateway) v0.4.0
| |
| ---------------------------------------------------
| |
| | |
| Checking command line arguments...
| |
| Debug/log level is 4
| |
| Debug/log timestamps are disabled
| |
| Debug/log colors are enabled
| |
| Adding 'vmnet' to the ICE ignore list...
| |
| Using 172.31.28.115 as local IP...
| |
| [WARN] Token based authentication disabled
| |
| Initializing recorder code
| |
| Initializing ICE stuff (Full mode, ICE-TCP candidates disabled, half-trickle, IPv6 support disabled)
| |
| TURN REST API backend: (disabled)
| |
| [WARN] Janus is deployed on a private address (172.31.28.115) but you didn't specify any STUN server! Expect trouble if this is supposed to work over the internet and not just in a LAN...
| |
| Crypto: OpenSSL pre-1.1.0
| |
| [WARN] The libsrtp installation does not support AES-GCM profiles
| |
| Fingerprint of our certificate: D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38
| |
| [WARN] Data Channels support not compiled
| |
| [WARN] Event handlers support disabled
| |
| Plugins folder: /opt/janus/lib/janus/plugins
| |
| Loading plugin 'libjanus_echotest.so'...
| |
| JANUS EchoTest plugin initialized!
| |
| Loading plugin 'libjanus_recordplay.so'...
| |
| JANUS Record&Play plugin initialized!
| |
| Loading plugin 'libjanus_nosip.so'...
| |
| JANUS NoSIP plugin initialized!
| |
| Loading plugin 'libjanus_streaming.so'...
| |
| JANUS Streaming plugin initialized!
| |
| Loading plugin 'libjanus_videocall.so'...
| |
| JANUS VideoCall plugin initialized!
| |
| Loading plugin 'libjanus_videoroom.so'...
| |
| JANUS VideoRoom plugin initialized!
| |
| Loading plugin 'libjanus_textroom.so'...
| |
| JANUS TextRoom plugin initialized!
| |
| Loading plugin 'libjanus_lua.so'...
| |
| [ERR] [plugins/janus_lua.c:janus_lua_init:1136] Error loading Lua script /opt/janus/share/janus/lua/echotest.lua: /opt/janus/share/janus/lua/echotest.lua:6: module 'json' not found:
| |
| no field package.preload['json']
| |
| no file './json.lua'
| |
| no file '/usr/share/lua/5.1/json.lua'
| |
| no file '/usr/share/lua/5.1/json/init.lua'
| |
| no file '/usr/lib64/lua/5.1/json.lua'
| |
| no file '/usr/lib64/lua/5.1/json/init.lua'
| |
| no file '/opt/janus/share/janus/lua/json.lua'
| |
| no file './json.so'
| |
| no file '/usr/lib64/lua/5.1/json.so'
| |
| no file '/usr/lib64/lua/5.1/loadall.so'
| |
| [WARN] The 'janus.plugin.lua' plugin could not be initialized
| |
| Transport plugins folder: /opt/janus/lib/janus/transports
| |
| Loading transport plugin 'libjanus_pfunix.so'...
| |
| [WARN] Unix Sockets server disabled (Janus API)
| |
| [WARN] Unix Sockets server disabled (Admin API)
| |
| [WARN] No Unix Sockets server started, giving up...
| |
| [WARN] The 'janus.transport.pfunix' plugin could not be initialized
| |
| [FATAL] [janus.c:main:4209] No Janus API transport is available... enable at least one and restart Janus
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
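| ## side note on the lua plugin failure: it's just a missing 'json' module on lua's package.path. one likely fix (untested here, and assuming the script's expected json API matches) would be dropping a single-file pure-lua json module into one of the dirs listed in the error, e.g.
|
| <pre>
|
| # rxi/json.lua is a single-file json module; /opt/janus/share/janus/lua is one of the searched paths per the error above
|
| wget -O /opt/janus/share/janus/lua/json.lua https://raw.githubusercontent.com/rxi/json.lua/master/json.lua
|
| </pre>
|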
| # changed the janus.transport.pfunix.cfg config file to enable it
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus]# cd /opt/janus/etc/janus
| |
| [root@ip-172-31-28-115 janus]# cp janus.transport.pfunix.cfg janus.transport.pfunix.cfg.orig
| |
| [root@ip-172-31-28-115 janus]# vim janus.transport.pfunix.cfg
| |
| [root@ip-172-31-28-115 janus]# diff janus.transport.pfunix.cfg janus.transport.pfunix.cfg.orig
| |
| 6c6
| |
| < enabled = yes ; Whether to enable the Unix Sockets interface
| |
| ---
| |
| > enabled = no ; Whether to enable the Unix Sockets interface
| |
| [root@ip-172-31-28-115 janus]#
| |
| </pre>
| |
| # that changed the error to complaining that no path was configured:
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# /opt/janus/bin/janus
| |
| ...
| |
| Loading transport plugin 'libjanus_pfunix.so'...
| |
| [WARN] No path configured, skipping Unix Sockets server (Janus API)
| |
| [WARN] Unix Sockets server disabled (Admin API)
| |
| [WARN] No Unix Sockets server started, giving up...
| |
| [WARN] The 'janus.transport.pfunix' plugin could not be initialized
| |
| [FATAL] [janus.c:main:4209] No Janus API transport is available... enable at least one and restart Janus
| |
| [root@ip-172-31-28-115 janus]#
| |
| </pre>
| |
| # set the path in the same config file
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus]# cd /opt/janus/etc/janus
| |
| [root@ip-172-31-28-115 janus]# cp janus.transport.pfunix.cfg janus.transport.pfunix.cfg.orig
| |
| [root@ip-172-31-28-115 janus]# vim janus.transport.pfunix.cfg
| |
| [root@ip-172-31-28-115 janus]# diff janus.transport.pfunix.cfg janus.transport.pfunix.cfg.orig
| |
| 6c6
| |
| < enabled = yes ; Whether to enable the Unix Sockets interface
| |
| ---
| |
| > enabled = no ; Whether to enable the Unix Sockets interface
| |
| 11d10
| |
| < path = /opt/janus/lib/janus/ux-janusapi.sock
| |
| [root@ip-172-31-28-115 janus]#
| |
| </pre>
| |
| # created a dns entry for jangouts.opensourceecology.org pointing to the ec2 dev instance
| |
| # I created a vhost in nginx and added the files in the html dir of the janus-gateway github into the new vhost's docroot https://github.com/meetecho/janus-gateway/tree/master/html
| |
| # without any further configuration, I got a network error "Probably a network error, is the gateway down?: [object Object]"
| |
| # this is all client-side code, so for the echo test demo we'd configure the javascript file = echotest.js
| |
| # that's not going to be able to reach the janus api running on a unix socket; I probably need to expose that somehow, perhaps by having nginx proxy to the socket?
| |
| ## this is described here https://janus.conf.meetecho.com/docs/rest.html#plainhttp
| |
| # after reading through the docs, there appear to be a lot of pluggable transports for the janus api. lots of sources suggest that the http rest api is the default, but apparently it didn't get installed on my system. digging into the code shows that this http transport depends on libmicrohttpd https://github.com/meetecho/janus-gateway/blob/master/transports/janus_http.c
| |
| # I installed libmicrohttpd & reinstalled janus
| |
| <pre>
| |
| yum install -y libmicrohttpd
| |
| sh autogen.sh
| |
| ./configure --prefix=/opt/janus
| |
| make
| |
| make install
| |
| make configs
| |
| </pre>
| |
| # that didn't work, and the configure summary showed why
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus
| |
| ...
| |
| config.status: executing libtool commands
| |
| | |
| libsrtp version: 2.x
| |
| SSL/crypto library: OpenSSL
| |
| DTLS set-timeout: not available
| |
| DataChannels support: no
| |
| Recordings post-processor: no
| |
| TURN REST API client: yes
| |
| Doxygen documentation: no
| |
| Transports:
| |
| REST (HTTP/HTTPS): no
| |
| WebSockets: no
| |
| RabbitMQ: no
| |
| MQTT: no
| |
| Unix Sockets: yes
| |
| Plugins:
| |
| Echo Test: yes
| |
| Streaming: yes
| |
| Video Call: yes
| |
| SIP Gateway (Sofia): no
| |
| SIP Gateway (libre): no
| |
| NoSIP (RTP Bridge): yes
| |
| Audio Bridge: no
| |
| Video Room: yes
| |
| Voice Mail: no
| |
| Record&Play: yes
| |
| Text Room: yes
| |
| Lua Interpreter: yes
| |
| Event handlers:
| |
| Sample event handler: yes
| |
| RabbitMQ event handler:no
| |
| JavaScript modules: no
| |
| | |
| If this configuration is ok for you, do a 'make' to start building Janus. A 'make install' will install Janus and its plugins to the specified prefix. Finally, a 'make configs' will install some sample configuration files too (something you'll only want to do the first time, though).
| |
| | |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| ...
| |
| </pre>
| |
| ## under the "Transports" section, only "Unix Sockets" is "yes'. We probably need "REST (HTTP/HTTPS)" to be "yes" too
| |
| # I dug through the 'configure' file & discovered the option is called '--enable-rest', but executing that said libmicrohttpd was not found, even though it's installed!
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus --enable-rest
| |
| ...
| |
| checking for TRANSPORTS... yes
| |
| checking for MHD... no
| |
| configure: error: libmicrohttpd not found. See README.md for installation instructions or use --disable-rest
| |
| [root@ip-172-31-28-115 janus-gateway]# rpm -qa | grep -i libmicrohttpd
| |
| libmicrohttpd-0.9.33-2.el7.x86_64
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # I fixed this by installing the libmicrohttpd-devel package
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# yum --enablerepo=* -y search libmicrohttpd
| |
| Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
| |
| N/S matched: libmicrohttpd
| |
| libmicrohttpd-debuginfo.i686 : Debug information for package libmicrohttpd
| |
| libmicrohttpd-debuginfo.x86_64 : Debug information for package libmicrohttpd
| |
| libmicrohttpd-devel.i686 : Development files for libmicrohttpd
| |
| libmicrohttpd-devel.x86_64 : Development files for libmicrohttpd
| |
| libmicrohttpd-doc.noarch : Documentation for libmicrohttpd
| |
| libmicrohttpd.i686 : Lightweight library for embedding a webserver in applications
| |
| libmicrohttpd.x86_64 : Lightweight library for embedding a webserver in applications
| |
| | |
| Name and summary matches only, use "search all" for everything.
| |
| [root@ip-172-31-28-115 htdocs]# yum --enablerepo=* -y install libmicrohttpd-devel
| |
| Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
| |
| Resolving Dependencies
| |
| --> Running transaction check
| |
| ---> Package libmicrohttpd-devel.x86_64 0:0.9.33-2.el7 will be installed
| |
| --> Finished Dependency Resolution
| |
| | |
| Dependencies Resolved
| |
| | |
| ======================================================================================================================================================================
| |
| Package Arch Version Repository Size
| |
| ======================================================================================================================================================================
| |
| Installing:
| |
| libmicrohttpd-devel x86_64 0.9.33-2.el7 rhui-REGION-rhel-server-optional 28 k
| |
| | |
| Transaction Summary
| |
| ======================================================================================================================================================================
| |
| Install 1 Package
| |
| | |
| Total download size: 28 k
| |
| Installed size: 74 k
| |
| Downloading packages:
| |
| libmicrohttpd-devel-0.9.33-2.el7.x86_64.rpm | 28 kB 00:00:00
| |
| Running transaction check
| |
| Running transaction test
| |
| Transaction test succeeded
| |
| Running transaction
| |
| Installing : libmicrohttpd-devel-0.9.33-2.el7.x86_64 1/1
| |
| Verifying : libmicrohttpd-devel-0.9.33-2.el7.x86_64 1/1
| |
| | |
| Installed:
| |
| libmicrohttpd-devel.x86_64 0:0.9.33-2.el7
| |
| | |
| Complete!
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # and re-configured + installed
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# ./configure --prefix=/opt/janus --enable-rest
| |
| ...
| |
| checking for EVENTS... yes
| |
| checking for npm... /bin/npm
| |
| checking that generated files are newer than configure... done
| |
| configure: creating ./config.status
| |
| config.status: creating Makefile
| |
| config.status: creating html/Makefile
| |
| config.status: creating docs/Makefile
| |
| config.status: executing depfiles commands
| |
| config.status: executing libtool commands
| |
| | |
| libsrtp version: 2.x
| |
| SSL/crypto library: OpenSSL
| |
| DTLS set-timeout: not available
| |
| DataChannels support: no
| |
| Recordings post-processor: no
| |
| TURN REST API client: yes
| |
| Doxygen documentation: no
| |
| Transports:
| |
| REST (HTTP/HTTPS): yes
| |
| WebSockets: no
| |
| RabbitMQ: no
| |
| MQTT: no
| |
| Unix Sockets: yes
| |
| Plugins:
| |
| Echo Test: yes
| |
| Streaming: yes
| |
| Video Call: yes
| |
| SIP Gateway (Sofia): no
| |
| SIP Gateway (libre): no
| |
| NoSIP (RTP Bridge): yes
| |
| Audio Bridge: no
| |
| Video Room: yes
| |
| Voice Mail: no
| |
| Record&Play: yes
| |
| Text Room: yes
| |
| Lua Interpreter: yes
| |
| Event handlers:
| |
| Sample event handler: yes
| |
| RabbitMQ event handler:no
| |
| JavaScript modules: no
| |
| | |
| If this configuration is ok for you, do a 'make' to start building Janus. A 'make install' will install Janus and its plugins to the specified prefix. Finally, a 'make configs' will install some sample configuration files too (something you'll only want to do the first time, though).
| |
| | |
| [root@ip-172-31-28-115 janus-gateway]# make && make install && make configs
| |
| ...
| |
| [root@ip-172-31-28-115 janus-gateway]#
| |
| </pre>
| |
| # and this time it started with the REST-over-HTTP transport instead of the unix socket
| |
| <pre>
| |
| [root@ip-172-31-28-115 janus-gateway]# /opt/janus/bin/janus
| |
| Janus commit: d8da250294cbdc193252ce059ef281ba0e2ff5bd
| |
| Compiled on: Fri May 4 00:11:11 UTC 2018
| |
| | |
| ---------------------------------------------------
| |
| Starting Meetecho Janus (WebRTC Gateway) v0.4.0
| |
| ---------------------------------------------------
| |
| | |
| Checking command line arguments...
| |
| Debug/log level is 4
| |
| Debug/log timestamps are disabled
| |
| Debug/log colors are enabled
| |
| Adding 'vmnet' to the ICE ignore list...
| |
| Using 172.31.28.115 as local IP...
| |
| [WARN] Token based authentication disabled
| |
| Initializing recorder code
| |
| Initializing ICE stuff (Full mode, ICE-TCP candidates disabled, half-trickle, IPv6 support disabled)
| |
| TURN REST API backend: (disabled)
| |
| [WARN] Janus is deployed on a private address (172.31.28.115) but you didn't specify any STUN server! Expect trouble if this is supposed to work over the internet and not just in a LAN...
| |
| Crypto: OpenSSL pre-1.1.0
| |
| [WARN] The libsrtp installation does not support AES-GCM profiles
| |
| Fingerprint of our certificate: D2:B9:31:8F:DF:24:D8:0E:ED:D2:EF:25:9E:AF:6F:B8:34:AE:53:9C:E6:F3:8F:F2:64:15:FA:E8:7F:53:2D:38
| |
| [WARN] Data Channels support not compiled
| |
| [WARN] Event handlers support disabled
| |
| Plugins folder: /opt/janus/lib/janus/plugins
| |
| Loading plugin 'libjanus_echotest.so'...
| |
| JANUS EchoTest plugin initialized!
| |
| Loading plugin 'libjanus_recordplay.so'...
| |
| JANUS Record&Play plugin initialized!
| |
| Loading plugin 'libjanus_nosip.so'...
| |
| JANUS NoSIP plugin initialized!
| |
| Loading plugin 'libjanus_streaming.so'...
| |
| JANUS Streaming plugin initialized!
| |
| Loading plugin 'libjanus_videocall.so'...
| |
| JANUS VideoCall plugin initialized!
| |
| Loading plugin 'libjanus_videoroom.so'...
| |
| JANUS VideoRoom plugin initialized!
| |
| Loading plugin 'libjanus_textroom.so'...
| |
| JANUS TextRoom plugin initialized!
| |
| Loading plugin 'libjanus_lua.so'...
| |
| [ERR] [plugins/janus_lua.c:janus_lua_init:1136] Error loading Lua script /opt/janus/share/janus/lua/echotest.lua: /opt/janus/share/janus/lua/echotest.lua:6: module 'json' not found:
| |
| no field package.preload['json']
| |
| no file './json.lua'
| |
| no file '/usr/share/lua/5.1/json.lua'
| |
| no file '/usr/share/lua/5.1/json/init.lua'
| |
| no file '/usr/lib64/lua/5.1/json.lua'
| |
| no file '/usr/lib64/lua/5.1/json/init.lua'
| |
| no file '/opt/janus/share/janus/lua/json.lua'
| |
| no file './json.so'
| |
| no file '/usr/lib64/lua/5.1/json.so'
| |
| no file '/usr/lib64/lua/5.1/loadall.so'
| |
| [WARN] The 'janus.plugin.lua' plugin could not be initialized
| |
| Transport plugins folder: /opt/janus/lib/janus/transports
| |
| Loading transport plugin 'libjanus_http.so'...
| |
| Joining Janus requests handler thread
| |
| Sessions watchdog started
| |
| HTTP webserver started (port 8088, /janus path listener)...
| |
| [WARN] HTTPS webserver disabled
| |
| [WARN] Admin/monitor HTTP webserver disabled
| |
| [WARN] Admin/monitor HTTPS webserver disabled
| |
| JANUS REST (HTTP/HTTPS) transport plugin initialized!
| |
| Loading transport plugin 'libjanus_pfunix.so'...
| |
| [WARN] Unix Sockets server disabled (Janus API)
| |
| [WARN] Unix Sockets server disabled (Admin API)
| |
| [WARN] No Unix Sockets server started, giving up...
| |
| [WARN] The 'janus.transport.pfunix' plugin could not be initialized
| |
| </pre>
| |
| # I still got an error on the demo page ("Probably a network error, is the gateway down?: [object Object]"), but that's probably an aws security group issue; it does appear to be listening over http, finally
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# ss -plan | grep -i janus
| |
| u_str ESTAB 0 0 * 900321 * 900320 users:(("janus",pid=25589,fd=13))
| |
| u_str ESTAB 0 0 * 900320 * 900321 users:(("janus",pid=25589,fd=12))
| |
| udp UNCONN 0 0 *:5002 *:* users:(("janus",pid=25589,fd=5))
| |
| udp UNCONN 0 0 *:5004 *:* users:(("janus",pid=25589,fd=6))
| |
| tcp LISTEN 0 32 :::8088 :::* users:(("janus",pid=25589,fd=11))
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
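| ## a quick sanity check of the REST transport from the box itself (using the port 8088 & /janus base path from the startup log above) would be something like:
|
| <pre>
|
| # a successful 'create' should return a json blob containing a new session id
|
| curl -s -X POST http://localhost:8088/janus -d '{"janus": "create", "transaction": "sanitycheck1"}'
|
| </pre>
|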
| # I created a new security group called 'videoconf-dev' that has inbound ports opened for 22, 443, 80, & 8088. I assigned this sec group to the dev node in ec2 in addition to default
| |
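| ## for reference, the same change via the aws cli would be something like this (group name from above; a vpc security group may need --group-id instead of --group-name):
|
| <pre>
|
| # open tcp/8088 to the world, matching the console change
|
| aws ec2 authorize-security-group-ingress --group-name videoconf-dev --protocol tcp --port 8088 --cidr 0.0.0.0/0
|
| </pre>
|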
| # that helped; now chrome is yelling at me that it doesn't want to initiate a would-be-secure webrtc connection over an insecure http line. it makes sense, but I wish it could be overridden for this test
| |
| <pre>
| |
| WebRTC error... {"name":"NotSupportedError","message":"Only secure origins are allowed (see: https://goo.gl/Y0ZkNV)."}
| |
| </pre>
| |
| ## this is supposed to be possible with the aptly named --unsafely-treat-insecure-origin-as-secure flag https://sites.google.com/a/chromium.org/dev/Home/chromium-security/deprecating-powerful-features-on-insecure-origins
| |
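| ### e.g., something like this should work (assuming the flag still exists in current chrome; it only takes effect when paired with a throwaway --user-data-dir, and the origin to whitelist is the demo page's):
|
| <pre>
|
| google-chrome --user-data-dir=/tmp/chrome-insecure-test --unsafely-treat-insecure-origin-as-secure="http://jangouts.opensourceecology.org"
|
| </pre>
|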
| | |
| ################################
| |
| # documented the install guide for janus
| |
| <pre>
| |
| # install epel
| |
| yum -y install epel-release
| |
| | |
| # install other depends, per the documentation
| |
| yum -y install libmicrohttpd-devel jansson-devel libnice-devel openssl-devel libsrtp-devel sofia-sip-devel glib-devel opus-devel libogg-devel libcurl-devel lua-devel pkgconfig gengetopt libtool autoconf automake
| |
| | |
| # install other depends, per my discovery of their necessity
| |
| yum -y install glibc2-devel
| |
| yum --enablerepo=* -y install libnice-devel jansson-devel lua-devel
| |
| | |
| # add lib dir
| |
| cat << EOF > /etc/ld.so.conf.d/janus.conf
| |
| /usr/lib
| |
| /opt/janus/lib/janus/plugins
| |
| EOF
| |
| ldconfig
| |
| | |
| # get & compile janus gateway
| |
| mkdir -p $HOME/sandbox
| |
| pushd $HOME/sandbox
| |
| git clone https://github.com/meetecho/janus-gateway.git
| |
| cd janus-gateway
| |
| sh autogen.sh
| |
| | |
| ./configure --prefix=/opt/janus
| |
| make
| |
| make install
| |
| make configs
| |
| </pre>
| |
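| ## note for the guide: without libmicrohttpd-devel, configure silently builds janus without the http transport, so it's worth grepping the summary before running make, e.g.
|
| <pre>
|
| # should print "REST (HTTP/HTTPS): yes"
|
| ./configure --prefix=/opt/janus | grep 'REST (HTTP/HTTPS)'
|
| </pre>
|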
| | |
| =Tue May 01, 2018=
| |
| # Marcin pointed out that our prod wiki doesn't allow new users to register because the recaptcha api is refusing to generate content with the error "reCAPTCHA V1 IS SHUTDOWN / Direct site owners to g.co/recaptcha/upgrade" http://opensourceecology.org/w/index.php?title=Special:RequestAccount
| |
| # Before I make changes to the prod site, I first confirmed that last night's backup completed successfully
| |
| <pre>
| |
| hancock% du -sh hetzner1/*
| |
| 0 hetzner1/20180426-052001
| |
| 12G hetzner1/20180427-052001
| |
| 12G hetzner1/20180428-052001
| |
| 12G hetzner1/20180429-052001
| |
| 12G hetzner1/20180430-052001
| |
| 12G hetzner1/20180501-052002
| |
| hancock%
| |
| </pre>
| |
| # I found our existing recaptcha keys at $wgReCaptchaPublicKey and $wgReCaptchaPrivateKey in hetzner1:/usr/www/users/soemain/w/LocalSettings.php
| |
| # I tried changing $wgCaptchaClass from 'ReCaptcha' to 'MathCaptcha', but it immediately caused an error
| |
| # it's not obvious which is our current account with our recaptcha credentials, so I just created a case-specific account for this & stored its credentials in keepass = recaptcha@opensourceecology.org
| |
| # I created a new key pair for ReCaptcha v2, but simply dropping it in made no changes. I dug deeper, and I found this note in the ConfirmEdit extension wiki page https://www.mediawiki.org/wiki/Extension:ConfirmEdit#ReCaptcha
| |
| <blockquote>
| |
| As noted in the ReCaptcha FAQ, Google does not support the ReCaptcha version 1 anymore, on which this CAPTCHA module depends. You should consider upgrading to version 2 of ReCaptcha (see the ReCaptchaNoCaptcha module). This module will be removed from ConfirmEdit in the near future (see task T142133).
| |
| </blockquote>
| |
| # That page also lists other captcha options: SimpleCaptcha, FancyCaptcha, MathCaptcha, QuestyCaptcha, ReCaptcha, RecaptchaNoCaptcha
| |
| ## I earlier confirmed that MathCaptcha fails
| |
| ## I confirmed that SimpleCaptcha works
| |
| ## I confirmed that FancyCaptcha fails
| |
| <pre>
| |
| [Tue May 01 15:39:44.432265 2018] [:error] [pid 10877] [client 127.0.0.1:41864] PHP Fatal error: Class 'FancyCaptcha' not found in /var/www/html/wiki.opensourceecology.org/htdocs/extensions/ConfirmAccount/frontend/specialpages/actions/RequestAccount_body.php on line 248
| |
| </pre>
| |
| ## I confirmed that QuestyCaptcha fails
| |
| <pre>
| |
| [Tue May 01 15:40:46.098136 2018] [:error] [pid 8193] [client 127.0.0.1:41984] PHP Fatal error: Class 'QuestyCaptcha' not found in /var/www/html/wiki.opensourceecology.org/htdocs/extensions/ConfirmAccount/frontend/specialpages/actions/RequestAccount_body.php on line 248
| |
| </pre>
| |
| # the above failures make sense, per the documentation
| |
| <blockquote>
| |
| Some of these modules require additional setup work:
| |
| | |
| MathCaptcha requires both the presence of TeX and, for versions of MediaWiki after 1.17, the Math extension;
| |
| FancyCaptcha requires running a preliminary setup script in Python;
| |
| and reCAPTCHA requires obtaining API keys.
| |
| </blockquote>
| |
| # I don't want to break the prod wiki, and this should really be done *after* the migration, so for now I'll just set the prod site to use SimpleCaptcha. After the migration, I'd like to look into using a question (ie: what is the opposite of open source?) + a locally-generated captcha (FancyCaptcha). Personally, I've had recaptcha break on me many times in the past, and I'd rather not depend on an external service for this.
| |
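| ## the switch itself is a one-line LocalSettings.php change; roughly this (a sketch, assuming the variable is set exactly as found above):
|
| <pre>
|
| # flip the ConfirmEdit captcha class from ReCaptcha to SimpleCaptcha
|
| sed -i "s/\$wgCaptchaClass = 'ReCaptcha';/\$wgCaptchaClass = 'SimpleCaptcha';/" LocalSettings.php
|
| </pre>
|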
| | |
| =Thr Apr 26, 2018=
| |
| # Meeting with Marcin
| |
| ## Wiki validation so far
| |
| ### 2 outstanding issues
| |
| #### We have a superfluous link at the top-right: both "Create Account" and "Request Account" appear; we should delete the "Create Account" link
| |
| #### Marcin can't log in because his old password was <20 characters, and the new wiki requires Administrators to have >=20 character passwords
| |
| ## Test Plan
| |
| ### marcin will make a first draft of a google doc with an exhaustive list of things to test on the wiki (50-100-ish items) & send it to me by next week. it should include:
| |
| #### Both simple & complex daily or routine wiki tasks
| |
| #### OSE-specific workflow tasks on the wiki that have broken in the past per your experiences that pre-date me
| |
| #### other wiki functions that we discovered were broken in the past few months when validating the staging wiki
| |
| ### we may migrate the wiki together in-person when I visit FeF for a few days in Mid-May
| |
| # Jitsi
| |
| ## I had some questions about this (see below), but we didn't get to it (we focused mostly on the wiki & backups)
| |
| ### what are the reqs for September? Same as before?
| |
| ### thoughts on hipchat?
| |
| ### maybe Rocket.Chat
| |
| ### or maybe jangouts powered by janus (C > Node)
| |
| # I reset Marcin's password on the wiki to be >=20 characters using 'htdocs/maintenance/changePassword.php'
| |
| <pre>
| |
| [root@hetzner2 maintenance]# php changePassword.php --user=Marcin --password='fake567890example890'
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 674
| |
| PHP Notice: Undefined index: HTTP_USER_AGENT in /var/www/html/wiki.opensourceecology.org/LocalSettings.php on line 5
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php on line 693
| |
| Password set for Marcin
| |
| [root@hetzner2 maintenance]#
| |
| </pre>
| |
| | |
| =Wed Apr 25, 2018=
| |
| # atlassian got back to me giving me a free Hipchat Community License
| |
| # it's not well advertised, but I managed to find that (as of 2018-01) the limit of participants in a single Stride call is 25 https://community.atlassian.com/t5/Stride-questions/How-many-participants-can-join-a-stride-video-call/qaq-p/694918
| |
| # while Hipchat has a limit of 20 participants https://confluence.atlassian.com/hipchat/video-chat-with-your-team-838548935.html
| |
| # updated our Videoconferencing article to include Rocket Chat and Hipchat
| |
| | |
| =Tue Apr 24, 2018=
| |
| # I sent an email to Atlassian requesting a community license request for Stride https://www.atlassian.com/software/views/community-license-request
| |
| # Atlassian automatically mailed me a service desk ticket for this request https://getsupport.atlassian.com/servicedesk/customer/portal/35/CA-452135
| |
| # I found a great article describing what Jitsi gives us as an SFU https://webrtchacks.com/atlassian-sfu-qa/
| |
| ## and this video comparing mesh vs sfu vs mcu https://webrtcglossary.com/mcu/
| |
| ## and this great article https://bloggeek.me/how-many-users-webrtc-call/
| |
| ## we might actually want to look into open source MCUs instead of an SFU. The downside is more processing on our server, but I think that'd be less of a bottleneck than each participant downloading n streams.
| |
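| ## rough numbers to make that concrete: with 8 participants each sending ~1 Mbps, full mesh costs every client 7 Mbps up + 7 Mbps down; an SFU costs every client 1 up + 7 down (and the server relays 8 streams in / 56 out); an MCU costs every client just 1 up + 1 down, but the server must decode & re-encode all 8 streams
|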
| # responded to Marcin's validation follow-up email
| |
| ## (3) Chrome ERR_BLOCKED_BY_XSS_AUDITOR on Preview
| |
| ### marcin found another permission denied issue when attempting to view the preview of the FAQ page https://wiki.opensourceecology.org/index.php?title=FAQ&action=submit
| |
| #### this was a red herring; it's just another mod_security false-positive that's distinct from the chrome ERR_BLOCKED_BY_XSS_AUDITOR issue
| |
| #### I whitelisted 959071 = sqli, and I confirmed that fixed the issue
| |
| ### Marcin said he's running chromium Version 64.0.3282.167 (Official Build) Built on Ubuntu , running on Ubuntu 16.04 (64-bit)
| |
| ### My chromium is Version 57.0.2987.98 Built on 8.7, running on Debian 8.10 (64-bit)
| |
| ### I went ahead and updated my ose VM to debian 9, but chromium in debian 9 is still Version 57.0.2987.98 Built on 8.7, running on Debian 9.4 (64-bit)
| |
| ### the preview bug was supposed to be fixed in v58, but maybe the bug was just that it was blocking the page *and* the debugging. In any case, I can't see what's causing chrome to issue the ERR_BLOCKED_BY_XSS_AUDITOR
| |
| ### I spun up a disposable vm & installed the latest version of google chrome, but that failed
| |
| <pre>
| |
| [user@fedora-23-dvm ~]$ wget https://dl.google.com/linux/direct/google-chrome-stable_current_x86_64.rpm
| |
| --2018-04-24 13:38:31-- https://dl.google.com/linux/direct/google-chrome-stable_current_x86_64.rpm
| |
| Resolving dl.google.com (dl.google.com)... 172.217.23.174, 2a00:1450:4001:81f::200e
| |
| Connecting to dl.google.com (dl.google.com)|172.217.23.174|:443... connected.
| |
| HTTP request sent, awaiting response... 200 OK
| |
| Length: 52364199 (50M) [application/x-rpm]
| |
| Saving to: ‘google-chrome-stable_current_x86_64.rpm’
| |
| | |
| google-chrome-stable_current_x86 100%[=========================================================>] 49.94M 1.19MB/s in 47s
| |
| | |
| 2018-04-24 13:39:24 (1.06 MB/s) - ‘google-chrome-stable_current_x86_64.rpm’ saved [52364199/52364199]
| |
| | |
| [user@fedora-23-dvm ~]$ sudo dnf install google-chrome-stable_current_x86_64.rpm
| |
| Last metadata expiration check: 0:09:47 ago on Tue Apr 24 13:37:39 2018.
| |
| Error: nothing provides libssl3.so(NSS_3.28)(64bit) needed by google-chrome-stable-66.0.3359.117-1.x86_64
| |
| (try to add '--allowerasing' to command line to replace conflicting packages)
| |
| [user@fedora-23-dvm ~]$ sudo dnf --allowerasing install google-chrome-stable_current_x86_64.rpm
| |
| Last metadata expiration check: 0:10:35 ago on Tue Apr 24 13:37:39 2018.
| |
| Error: nothing provides libssl3.so(NSS_3.28)(64bit) needed by google-chrome-stable-66.0.3359.117-1.x86_64
| |
| </pre>
| |
| ### fedora does have chromium in its repos; I installed it, but it was even more out-of-date @ Version 54.0.2840.90 Fedora Project (64-bit)
| |
| ### I found & installed the google-chrome depend (openssl-devel), but the same error occured. I found a fix of installing fedora 27 :| https://stackoverflow.com/questions/48839199/install-google-chrome-in-fedora-23
| |
| ### giving up on fedora 27, I created a new VM from debian-9 for installing google-chrome https://unix.stackexchange.com/questions/20614/how-do-i-install-the-latest-version-of-chromium-in-debian-squeeze
| |
| <pre>
| |
| root@google-chrome:~# vim /etc/apt/apt.conf.d/
| |
| 00notify-hook 20auto-upgrades 50appstream 70debconf
| |
| 01autoremove 20listchanges 50unattended-upgrades 70no-unattended
| |
| 01autoremove-kernels 20packagekit 60gnome-software
| |
| root@google-chrome:~# vim /etc/apt/apt.conf.d/^C
| |
| root@google-chrome:~# wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
| |
| --2018-04-24 14:29:16-- https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
| |
| Resolving dl.google.com (dl.google.com)... 172.217.23.174, 2a00:1450:4001:81f::200e
| |
| Connecting to dl.google.com (dl.google.com)|172.217.23.174|:443... connected.
| |
| HTTP request sent, awaiting response... 200 OK
| |
| Length: 52231002 (50M) [application/x-debian-package]
| |
| Saving to: ‘google-chrome-stable_current_amd64.deb’
| |
| | |
| google-chrome-stable_curre 100%[========================================>] 49.81M 1.19MB/s in 57s
| |
| | |
| 2018-04-24 14:30:24 (895 KB/s) - ‘google-chrome-stable_current_amd64.deb’ saved [52231002/52231002]
| |
| | |
| root@google-chrome:~# dpkg -i google-chrome-stable_current_amd64.deb
| |
| Selecting previously unselected package google-chrome-stable.
| |
| (Reading database ... 123078 files and directories currently installed.)
| |
| Preparing to unpack google-chrome-stable_current_amd64.deb ...
| |
| Unpacking google-chrome-stable (66.0.3359.117-1) ...
| |
| dpkg: dependency problems prevent configuration of google-chrome-stable:
| |
| google-chrome-stable depends on libappindicator3-1; however:
| |
| Package libappindicator3-1 is not installed.
| |
| | |
| dpkg: error processing package google-chrome-stable (--install):
| |
| dependency problems - leaving unconfigured
| |
| Processing triggers for man-db (2.7.6.1-2) ...
| |
| Processing triggers for qubes-core-agent (3.2.28-1+deb9u1) ...
| |
| Processing triggers for desktop-file-utils (0.23-1) ...
| |
| Processing triggers for mime-support (3.60) ...
| |
| Errors were encountered while processing:
| |
| google-chrome-stable
| |
| root@google-chrome:~#
| |
| </pre>
| |
| ### I installed it with fix broken
| |
| <pre>
| |
| apt-get --fix-broken install
| |
| </pre>
| |
| ### now I'm running Google Chrome Version 66.0.3359.117 (Official Build) (64-bit)
| |
| ### I loaded Marcin's log in this very-new-version of Google Chrome, logged in, clicked to edit the page, and to view the preview. I got the ERR_BLOCKED_BY_XSS_AUDITOR error again, and the console was still blank :(
| |
| ### As a test, I tried the same thing on our old site, and I got the issue again! That means this is an issue that's not relevant to our migration. If it hasn't been a major blocker before the migration, I think we should just mark this as "won't fix" with solution being "use firefox" for the few times this occurs
| |
| | |
| =Mon Apr 23, 2018=
| |
| # verified that our ec2 charges are still $0 in the aws console
| |
| # I created a dns entry for jitsi.opensourceecology.org on cloudflare pointing to 34.210.153.174 = our ec2 instance's public IP
| |
| # continuing with the jitsi manual install on our free ec2 t2.micro dev instance, I went to install jitsi meet; here's the easy part
| |
| <pre>
| |
| # download jitsi meet
| |
| pushd /var/www/html/jitsi.opensourceecology.org
| |
| git clone https://github.com/jitsi/jitsi-meet.git
| |
| mv htdocs htdocs.`date "+%Y%m%d_%H%M%S"`.old
| |
| mv "jitsi-meet" "htdocs"
| |
| pushd htdocs
| |
| </pre>
| |
| # but all the npm stuff fails, as Christian found. there are some mentions of this in the github issues, suggesting that some have made it work https://github.com/jitsi/jitsi-meet/search?o=desc&q=centos&s=created&type=Issues
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# yum install -y npm nodejs
| |
| ...
| |
| [root@ip-172-31-28-115 htdocs]# npm install
| |
| npm WARN deprecated nomnom@1.6.2: Package no longer supported. Contact support@npmjs.com for more info.
| |
| npm WARN deprecated connect@2.30.2: connect 2.x series is deprecated
| |
| Killed ..] / preinstall:url-polyfill: sill doParallel preinstall 855
| |
| [root@ip-172-31-28-115 htdocs]# make
| |
| ./node_modules/.bin/webpack -p
| |
| module.js:478
| |
| throw err;
| |
| ^
| |
| | |
| Error: Cannot find module 'babel-preset-react'
| |
| at Function.Module._resolveFilename (module.js:476:15)
| |
| at Function.resolve (internal/module.js:27:19)
| |
| at Object.<anonymous> (/var/www/html/jitsi.opensourceecology.org/htdocs/webpack.config.js:80:29)
| |
| at Module._compile (module.js:577:32)
| |
| at Object.Module._extensions..js (module.js:586:10)
| |
| at Module.load (module.js:494:32)
| |
| at tryModuleLoad (module.js:453:12)
| |
| at Function.Module._load (module.js:445:3)
| |
| at Module.require (module.js:504:17)
| |
| at require (internal/module.js:20:19)
| |
| make: *** [compile] Error 1
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # the source is line 80 of webpack.config.js, which has a line = "require.resolve('babel-preset-react')"
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# sed -n 70,90p /var/www/html/jitsi.opensourceecology.org/htdocs/webpack.config.js
| |
| // jitsi-meet. The require.resolve, of course, mandates the use
| |
| // of the prefix babel-preset- in the preset names.
| |
| presets: [
| |
| [
| |
| require.resolve('babel-preset-env'),
| |
| | |
| // Tell babel to avoid compiling imports into CommonJS
| |
| // so that webpack may do tree shaking.
| |
| { modules: false }
| |
| ],
| |
| require.resolve('babel-preset-react'),
| |
| require.resolve('babel-preset-stage-1')
| |
| ]
| |
| },
| |
| test: /\.jsx?$/
| |
| }, {
| |
| // Expose jquery as the globals $ and jQuery because it is expected
| |
| // to be available in such a form by multiple jitsi-meet
| |
| // dependencies including lib-jitsi-meet.
| |
| | |
| loader: 'expose-loader?$!expose-loader?jQuery',
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # this exact problem was mentioned in this github issue https://github.com/jitsi/jitsi-meet/issues/446
| |
| # I fixed this by manually installing the packages called out in the errors
| |
| <pre>
| |
| npm install babel-preset-react
| |
| npm install babel-preset-stage-1
| |
| </pre>
| |
| # but then the next make had a different issue
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# make
| |
| ./node_modules/.bin/webpack -p
| |
| Hash: f38346765ad69b5229dc9fb40aa6056b410b3138
| |
| Version: webpack 3.9.1
| |
| Child
| |
| Hash: f38346765ad69b5229dc
| |
| Time: 20725ms
| |
| Asset Size Chunks Chunk Names
| |
| dial_in_info_bundle.min.js 90.5 kB 0 [emitted] dial_in_info_bundle
| |
| app.bundle.min.js 90.3 kB 1 [emitted] app.bundle
| |
| dial_in_info_bundle.min.map 735 kB 0 [emitted] dial_in_info_bundle
| |
| app.bundle.min.map 735 kB 1 [emitted] app.bundle
| |
| [90] (webpack)/buildin/global.js 509 bytes {0} {1} [built]
| |
| [327] multi babel-polyfill whatwg-fetch ./app.js 52 bytes {1} [built]
| |
| [328] multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page 64 bytes {0} [built]
| |
| + 326 hidden modules
| |
| | |
| ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| | |
| ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| | |
| ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| | |
| ERROR in multi babel-polyfill whatwg-fetch ./app.js
| |
| Module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill whatwg-fetch ./app.js
| |
| | |
| ERROR in multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| Module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| | |
| ERROR in multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| Module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| | |
| ERROR in multi babel-polyfill whatwg-fetch ./app.js
| |
| Module not found: Error: Can't resolve 'whatwg-fetch' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill whatwg-fetch ./app.js
| |
| | |
| ERROR in multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| Module not found: Error: Can't resolve 'whatwg-fetch' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page
| |
| Child
| |
| Hash: 9fb40aa6056b410b3138
| |
| Time: 20699ms
| |
| Asset Size Chunks Chunk Names
| |
| external_api.min.js 90.5 kB 0 [emitted] external_api
| |
| external_api.min.map 735 kB 0 [emitted] external_api
| |
| [90] (webpack)/buildin/global.js 509 bytes {0} [built]
| |
| [125] multi babel-polyfill ./modules/API/external/index.js 40 bytes {0} [built]
| |
| + 326 hidden modules
| |
| | |
| ERROR in multi babel-polyfill ./modules/API/external/index.js
| |
| Module not found: Error: Can't resolve 'babel-loader' in '/var/www/html/jitsi.opensourceecology.org/htdocs'
| |
| @ multi babel-polyfill ./modules/API/external/index.js
| |
| make: *** [compile] Error 2
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # fixed by installing babel-loader
| |
| <pre>
| |
| npm install babel-loader
| |
| make
| |
| </pre>
| |
| # the next failure lacked any useful info:
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# make
| |
| ./node_modules/.bin/webpack -p
| |
| make: *** [compile] Killed
| |
| [root@ip-172-31-28-115 htdocs]# echo $?
| |
| 2
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
| |
| # I opened up port 80 & 443 on the server's security group in the aws console's ec2 service
| |
| # to simplify debugging, I changed the nginx config to use port 80, not 443.
| |
| # selinux had to be disabled to prevent the "permission denied" errors that popped up in the logs:
| |
| <pre>
| |
| ==> /var/log/nginx/error.log <==
| |
| 2018/04/24 02:57:58 [error] 3101#0: *1 open() "/var/www/html/jitsi.opensourceecology.org/htdocs/index.html" failed (13: Permission denied), client: 76.97.223.185, server: jitsi.opensourceecology.org, request: "GET /index.html HTTP/1.1", host: "ec2-34-210-153-174.us-west-2.compute.amazonaws.com"
| |
| | |
| ==> /var/log/nginx/access.log <==
| |
| 76.97.223.185 - - [24/Apr/2018:02:57:58 +0000] "GET /index.html HTTP/1.1" 403 180 "-" "Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:50.0) Gecko/20100101 Firefox/50.0" "-"
| |
| </pre>
| |
| # I disabled selinux by setting it to Permissive & restarting nginx
| |
| <pre>
| |
| [root@ip-172-31-28-115 jitsi.opensourceecology.org]# getenforce
| |
| Enforcing
| |
| [root@ip-172-31-28-115 jitsi.opensourceecology.org]# setenforce Permissive
| |
| [root@ip-172-31-28-115 jitsi.opensourceecology.org]# service nginx restart
| |
| Redirecting to /bin/systemctl restart nginx.service
| |
| [root@ip-172-31-28-115 jitsi.opensourceecology.org]#
| |
| </pre>
| |
| # ok, this got a page load, but with an error
| |
| <pre>
| |
| Uh oh! We couldn't fully download everything we needed :(
| |
| We will try again shortly. In the mean time, check for problems with your Internet connection!
| |
| | |
| Missing http://ec2-34-210-153-174.us-west-2.compute.amazonaws.com/libs/lib-jitsi-meet.min.js?v=139
| |
| | |
| show less reload now
| |
| </pre>
| |
| # ah, the "Killed" issue is because linux is running out of memory
| |
| <pre>
| |
| [root@ip-172-31-28-115 ~]# tail -n 100 /var/log/messages
| |
| Apr 24 03:00:35 ip-172-31-28-115 dbus[519]: avc: received setenforce notice (enforcing=0)
| |
| Apr 24 03:00:35 ip-172-31-28-115 dbus-daemon: dbus[519]: avc: received setenforce notice (enforcing=0)
| |
| Apr 24 03:00:42 ip-172-31-28-115 systemd: Stopping The nginx HTTP and reverse proxy server...
| |
| Apr 24 03:00:42 ip-172-31-28-115 systemd: Starting The nginx HTTP and reverse proxy server...
| |
| Apr 24 03:00:42 ip-172-31-28-115 nginx: nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
| |
| Apr 24 03:00:42 ip-172-31-28-115 nginx: nginx: configuration file /etc/nginx/nginx.conf test is successful
| |
| Apr 24 03:00:42 ip-172-31-28-115 systemd: Failed to read PID from file /run/nginx.pid: Invalid argument
| |
| Apr 24 03:00:42 ip-172-31-28-115 systemd: Started The nginx HTTP and reverse proxy server.
| |
| Apr 24 03:01:01 ip-172-31-28-115 systemd: Started Session 88 of user root.
| |
| Apr 24 03:01:01 ip-172-31-28-115 systemd: Starting Session 88 of user root.
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: systemd invoked oom-killer: gfp_mask=0x280da, order=0, oom_score_adj=0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: systemd cpuset=/ mems_allowed=0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: CPU: 0 PID: 1 Comm: systemd Not tainted 3.10.0-693.11.6.el7.x86_64 #1
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Hardware name: Xen HVM domU, BIOS 4.2.amazon 08/24/2006
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Call Trace:
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816a5ea1>] dump_stack+0x19/0x1b
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816a1296>] dump_header+0x90/0x229
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff812b9dfb>] ? cred_has_capability+0x6b/0x120
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff81188094>] oom_kill_process+0x254/0x3d0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff812b9fde>] ? selinux_capable+0x2e/0x40
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff811888d6>] out_of_memory+0x4b6/0x4f0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816a1d9a>] __alloc_pages_slowpath+0x5d6/0x724
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff8118eaa5>] __alloc_pages_nodemask+0x405/0x420
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff811d6075>] alloc_pages_vma+0xb5/0x200
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816a924e>] ? __wait_on_bit+0x7e/0x90
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff811b42a0>] handle_mm_fault+0xb60/0xfa0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff8132f4b3>] ? number.isra.2+0x323/0x360
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816b37e4>] __do_page_fault+0x154/0x450
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816b3b15>] do_page_fault+0x35/0x90
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816af8f8>] page_fault+0x28/0x30
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff81332459>] ? copy_user_enhanced_fast_string+0x9/0x20
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff8122854b>] ? seq_read+0x2ab/0x3b0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff8120295c>] vfs_read+0x9c/0x170
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff8120381f>] SyS_read+0x7f/0xe0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [<ffffffff816b89fd>] system_call_fastpath+0x16/0x1b
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Mem-Info:
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: active_anon:214589 inactive_anon:4438 isolated_anon:0#012 active_file:36 inactive_file:783 isolated_file:0#012 unevictable:0 dirty:0 writeback:0 unstable:0#012 slab_reclaimable:4921 slab_unreclaimable:8167#012 mapped:879 shmem:6524 pagetables:2371 bounce:0#012 free:12232 free_pcp:118 free_cma:0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Node 0 DMA free:4588kB min:704kB low:880kB high:1056kB active_anon:10344kB inactive_anon:4kB active_file:0kB inactive_file:0kB unevictable:0kB isolated(anon):0kB isolated(file):0kB present:15988kB managed:15904kB mlocked:0kB dirty:0kB writeback:0kB mapped:0kB shmem:360kB slab_reclaimable:268kB slab_unreclaimable:476kB kernel_stack:48kB pagetables:140kB unstable:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB writeback_tmp:0kB pages_scanned:0 all_unreclaimable? yes
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: lowmem_reserve[]: 0 973 973 973
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Node 0 DMA32 free:44340kB min:44348kB low:55432kB high:66520kB active_anon:848012kB inactive_anon:17748kB active_file:144kB inactive_file:3132kB unevictable:0kB isolated(anon):0kB isolated(file):0kB present:1032192kB managed:998676kB mlocked:0kB dirty:0kB writeback:0kB mapped:3516kB shmem:25736kB slab_reclaimable:19416kB slab_unreclaimable:32192kB kernel_stack:2352kB pagetables:9344kB unstable:0kB bounce:0kB free_pcp:472kB local_pcp:472kB free_cma:0kB writeback_tmp:0kB pages_scanned:711 all_unreclaimable? yes
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: lowmem_reserve[]: 0 0 0 0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Node 0 DMA: 25*4kB (UEM) 19*8kB (UE) 19*16kB (UEM) 12*32kB (UE) 5*64kB (UE) 12*128kB (UEM) 3*256kB (M) 0*512kB 1*1024kB (E) 0*2048kB 0*4096kB = 4588kB
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Node 0 DMA32: 1281*4kB (UEM) 1040*8kB (UE) 593*16kB (UEM) 251*32kB (UE) 67*64kB (UE) 41*128kB (UEM) 9*256kB (UEM) 3*512kB (M) 0*1024kB 0*2048kB 0*4096kB = 44340kB
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: 7345 total pagecache pages
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: 0 pages in swap cache
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Swap cache stats: add 0, delete 0, find 0/0
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Free swap = 0kB
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Total swap = 0kB
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: 262045 pages RAM
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: 0 pages HighMem/MovableOnly
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: 8400 pages reserved
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ pid ] uid tgid total_vm rss nr_ptes swapents oom_score_adj name
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 321] 0 321 8841 803 24 0 0 systemd-journal
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 519] 81 519 24607 167 18 0 -900 dbus-daemon
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 524] 0 524 135883 1481 84 0 0 NetworkManager
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 531] 0 531 6051 82 16 0 0 systemd-logind
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 562] 0 562 28343 3126 58 0 0 dhclient
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 1031] 0 1031 22386 260 43 0 0 master
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 1033] 89 1033 22429 253 46 0 0 qmgr
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 1139] 0 1139 27511 33 12 0 0 agetty
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 1140] 0 1140 27511 33 10 0 0 agetty
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 8587] 0 8587 11692 394 27 0 -1000 systemd-udevd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9607] 0 9607 28198 256 58 0 -1000 sshd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9722] 0 9722 13877 111 29 0 -1000 auditd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9760] 0 9760 58390 946 48 0 0 rsyslogd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9781] 0 9781 31571 159 18 0 0 crond
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9826] 999 9826 134633 1647 60 0 0 polkitd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9903] 0 9903 26991 38 9 0 0 rhnsd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9914] 0 9914 143438 3265 99 0 0 tuned
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 9985] 998 9985 25120 91 20 0 0 chronyd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [11651] 0 11651 32007 188 17 0 0 screen
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [11652] 0 11652 28891 135 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [12264] 1001 12264 1291983 15415 88 0 0 java
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [12516] 0 12516 47972 155 49 0 0 su
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [12517] 1001 12517 28859 103 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [12611] 0 12611 28859 105 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27867] 0 27867 39164 336 80 0 0 sshd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27870] 1000 27870 39361 552 78 0 0 sshd
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27871] 1000 27871 28859 95 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27892] 1000 27892 31908 66 18 0 0 screen
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27893] 1000 27893 32113 296 16 0 0 screen
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [27894] 1000 27894 28859 115 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30326] 0 30326 54628 261 63 0 0 sudo
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30327] 0 30327 47972 155 49 0 0 su
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30328] 0 30328 28892 125 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30369] 1000 30369 28859 105 13 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30385] 0 30385 54628 261 61 0 0 sudo
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30386] 0 30386 47972 155 49 0 0 su
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [30387] 0 30387 28892 141 14 0 0 bash
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3017] 997 3017 24190 5474 48 0 0 lua
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3164] 0 3164 30201 523 54 0 0 nginx
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3165] 996 3165 30317 649 56 0 0 nginx
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3177] 0 3177 30814 57 15 0 0 anacron
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3180] 89 3180 22428 276 46 0 0 pickup
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3185] 0 3185 27066 60 9 0 0 make
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: [ 3186] 0 3186 469819 174655 759 0 0 node
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Out of memory: Kill process 3186 (node) score 670 or sacrifice child
| |
| Apr 24 03:07:57 ip-172-31-28-115 kernel: Killed process 3186 (node) total-vm:1879276kB, anon-rss:698620kB, file-rss:0kB, shmem-rss:0kB
| |
| Apr 24 03:10:36 ip-172-31-28-115 su: (to root) ec2-user on pts/5
| |
| [root@ip-172-31-28-115 ~]#
| |
| </pre>
| |
| # looks like we need a bigger box :\ or swap?
| |
| <pre>
| |
| [root@ip-172-31-28-115 ~]# free -m
| |
| total used free shared buff/cache available
| |
| Mem: 990 179 689 25 121 652
| |
| Swap: 0 0 0
| |
| [root@ip-172-31-28-115 ~]# df -h
| |
| Filesystem Size Used Avail Use% Mounted on
| |
| /dev/xvda2 10G 3.8G 6.3G 38% /
| |
| devtmpfs 476M 0 476M 0% /dev
| |
| tmpfs 496M 0 496M 0% /dev/shm
| |
| tmpfs 496M 26M 470M 6% /run
| |
| tmpfs 496M 0 496M 0% /sys/fs/cgroup
| |
| tmpfs 100M 0 100M 0% /run/user/0
| |
| tmpfs 100M 0 100M 0% /run/user/1000
| |
| [root@ip-172-31-28-115 ~]#
| |
| </pre>
| |
| # we have 1G of RAM and 6G of free disk space. I created a 2G swap file & enabled it
| |
| <pre>
| |
| [root@ip-172-31-28-115 ~]# dd if=/dev/zero of=/swap1 bs=1M count=2048
| |
| 2048+0 records in
| |
| 2048+0 records out
| |
| 2147483648 bytes (2.1 GB) copied, 30.2187 s, 71.1 MB/s
| |
| [root@ip-172-31-28-115 ~]# mkswap /swap1
| |
| Setting up swapspace version 1, size = 2097148 KiB
| |
| no label, UUID=a401ef50-ccb8-4bca-abeb-9de5a63b107c
| |
| [root@ip-172-31-28-115 ~]# chmod 0600 /swap1
| |
| [root@ip-172-31-28-115 ~]# swapon /swap1
| |
| [root@ip-172-31-28-115 ~]# free -m
| |
| total used free shared buff/cache available
| |
| Mem: 990 180 69 25 740 608
| |
| Swap: 2047 0 2047
| |
| [root@ip-172-31-28-115 ~]#
| |
| </pre>
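| |
| # note: a swap file enabled with swapon alone won't survive a reboot; a minimal sketch for persisting it (assuming /swap1 stays where it is):
| |
| <pre>
| |
| # add the swap file to fstab so it's re-enabled at boot
| |
| echo '/swap1 swap swap defaults 0 0' >> /etc/fstab
| |
| # confirm it's listed & active
| |
| swapon -s
| |
| </pre>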
| |
| # the next `make` run didn't get killed by the oom-killer, but it spat out a bunch of complaints about missing modules:
| |
| <pre>
| |
| ./node_modules/.bin/webpack -p
| |
| Hash: 63ec7d579fb8d7f2444256d53b2e2b409f23ea8a
| |
| Version: webpack 3.9.1
| |
| Child
| |
| Hash: 63ec7d579fb8d7f24442
| |
| Time: 72401ms
| |
| Asset Size Chunks Chunk Names
| |
| app.bundle.min.js 317 kB 0 [emitted] [big] app.bundle
| |
| dial_in_info_bundle.min.js 116 kB 1 [emitted] dial_in_info_bundle
| |
| device_selection_popup_bundle.min.js 101 kB 2 [emitted] device_selection_popup_bundle
| |
| do_external_connect.min.js 2 kB 3 [emitted] do_external_connect
| |
| alwaysontop.min.js 731 bytes 4 [emitted] alwaysontop
| |
| app.bundle.min.map 4.21 MB 0 [emitted] app.bundle
| |
| dial_in_info_bundle.min.map 912 kB 1 [emitted] dial_in_info_bundle
| |
| device_selection_popup_bundle.min.map 2.11 MB 2 [emitted] device_selection_popup_bundle
| |
| do_external_connect.min.map 19.4 kB 3 [emitted] do_external_connect
| |
| alwaysontop.min.map 39.2 kB 4 [emitted] alwaysontop
| |
| [43] ./react/features/base/config/index.js + 4 modules 13.2 kB {0} {1} {2} [built]
| |
| [70] ./react/features/base/config/parseURLParams.js 1.51 kB {0} {1} {2} {3} [built]
| |
| [122] ./react/features/base/config/getRoomName.js 761 bytes {0} {1} {2} {3} [built]
| |
| [176] ./react/features/base/react/prop-types-polyfill.js 227 bytes {0} {1} {2} [built]
| |
| [444] ./modules/settings/Settings.js 6.12 kB {0} [built]
| |
| [450] multi babel-polyfill whatwg-fetch ./app.js 52 bytes {0} [built]
| |
| [451] ./app.js + 2 modules 4.67 kB {0} [built]
| |
| [465] ./modules/UI/UI.js 35 kB {0} [built]
| |
| [480] ./react/index.web.js 1.06 kB {0} [built]
| |
| [481] ./react/features/device-selection/popup.js 476 bytes {2} [built]
| |
| [482] ./react/features/device-selection/DeviceSelectionPopup.js 12.3 kB {2} [built]
| |
| [483] ./react/features/always-on-top/index.js + 2 modules 20.3 kB {4} [built]
| |
| [484] multi babel-polyfill whatwg-fetch ./react/features/base/react/prop-types-polyfill.js ./react/features/invite/components/dial-in-info-page 64 bytes {1} [built]
| |
| [485] ./react/features/invite/components/dial-in-info-page/index.js + 2 modules 4 kB {1} [built]
| |
| [486] ./connection_optimization/do_external_connect.js 2.51 kB {3} [built]
| |
| + 472 hidden modules
| |
| | |
| ERROR in ./react/features/invite/components/AddPeopleDialog.web.js
| |
| Module not found: Error: Can't resolve '@atlaskit/avatar' in '/var/www/html/jitsi.opensourceecology.org/htdocs/react/features/invite/components'
| |
| @ ./react/features/invite/components/AddPeopleDialog.web.js 15:0-38
| |
| @ ./react/features/invite/components/index.js
| |
| @ ./react/features/invite/index.js
| |
| @ ./react/features/toolbox/components/Toolbox.web.js
| |
| @ ./react/features/toolbox/components/index.js
| |
| @ ./react/features/toolbox/index.js
| |
| @ ./conference.js
| |
| @ ./app.js
| |
| @ multi babel-polyfill whatwg-fetch ./app.js
| |
| | |
| ERROR in ./react/features/base/dialog/components/StatelessDialog.web.js
| |
| Module not found: Error: Can't resolve '@atlaskit/button' in '/var/www/html/jitsi.opensourceecology.org/htdocs/react/features/base/dialog/components'
| |
| @ ./react/features/base/dialog/components/StatelessDialog.web.js 9:0-55
| |
| @ ./react/features/base/dialog/components/index.js
| |
| @ ./react/features/base/dialog/index.js
| |
| @ ./modules/keyboardshortcut/keyboardshortcut.js
| |
| @ ./app.js
| |
| @ multi babel-polyfill whatwg-fetch ./app.js
| |
| ...
| |
| </pre>
| |
| # the log file above was huge; here's a grep for just the missing module bit
| |
| <pre>
| |
| [root@ip-172-31-28-115 htdocs]# grep -i 'module not found' make.log | cut -d' ' -f1-11 | sort -u
| |
| Module not found: Error: Can't resolve '@atlaskit/avatar'
| |
| Module not found: Error: Can't resolve '@atlaskit/button'
| |
| Module not found: Error: Can't resolve '@atlaskit/dropdown-menu'
| |
| Module not found: Error: Can't resolve '@atlaskit/field-text'
| |
| Module not found: Error: Can't resolve '@atlaskit/field-text-area'
| |
| Module not found: Error: Can't resolve '@atlaskit/flag'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/chevron-down'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/editor/info'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/error'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/star'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/star-filled'
| |
| Module not found: Error: Can't resolve '@atlaskit/icon/glyph/warning'
| |
| Module not found: Error: Can't resolve '@atlaskit/inline-dialog'
| |
| Module not found: Error: Can't resolve '@atlaskit/inline-message'
| |
| Module not found: Error: Can't resolve '@atlaskit/layer-manager'
| |
| Module not found: Error: Can't resolve '@atlaskit/lozenge'
| |
| Module not found: Error: Can't resolve '@atlaskit/modal-dialog'
| |
| Module not found: Error: Can't resolve '@atlaskit/multi-select'
| |
| Module not found: Error: Can't resolve '@atlaskit/spinner'
| |
| Module not found: Error: Can't resolve '@atlaskit/tabs'
| |
| Module not found: Error: Can't resolve '@atlaskit/theme'
| |
| Module not found: Error: Can't resolve '@atlaskit/tooltip'
| |
| Module not found: Error: Can't resolve 'autosize'
| |
| Module not found: Error: Can't resolve 'i18next'
| |
| Module not found: Error: Can't resolve 'i18next-browser-languagedetector'
| |
| Module not found: Error: Can't resolve 'i18next-xhr-backend'
| |
| Module not found: Error: Can't resolve 'jitsi-meet-logger'
| |
| Module not found: Error: Can't resolve 'jquery'
| |
| Module not found: Error: Can't resolve 'jquery-contextmenu'
| |
| Module not found: Error: Can't resolve 'jquery-i18next'
| |
| Module not found: Error: Can't resolve 'jQuery-Impromptu'
| |
| Module not found: Error: Can't resolve 'js-md5'
| |
| Module not found: Error: Can't resolve 'jwt-decode'
| |
| Module not found: Error: Can't resolve 'lodash'
| |
| Module not found: Error: Can't resolve 'lodash/debounce'
| |
| Module not found: Error: Can't resolve 'lodash/throttle'
| |
| Module not found: Error: Can't resolve 'moment'
| |
| Module not found: Error: Can't resolve 'moment/locale/bg'
| |
| Module not found: Error: Can't resolve 'moment/locale/de'
| |
| Module not found: Error: Can't resolve 'moment/locale/eo'
| |
| Module not found: Error: Can't resolve 'moment/locale/es'
| |
| Module not found: Error: Can't resolve 'moment/locale/fr'
| |
| Module not found: Error: Can't resolve 'moment/locale/hy-am'
| |
| Module not found: Error: Can't resolve 'moment/locale/it'
| |
| Module not found: Error: Can't resolve 'moment/locale/nb'
| |
| Module not found: Error: Can't resolve 'moment/locale/pl'
| |
| Module not found: Error: Can't resolve 'moment/locale/pt'
| |
| Module not found: Error: Can't resolve 'moment/locale/pt-br'
| |
| Module not found: Error: Can't resolve 'moment/locale/ru'
| |
| Module not found: Error: Can't resolve 'moment/locale/sk'
| |
| Module not found: Error: Can't resolve 'moment/locale/sl'
| |
| Module not found: Error: Can't resolve 'moment/locale/sv'
| |
| Module not found: Error: Can't resolve 'moment/locale/tr'
| |
| Module not found: Error: Can't resolve 'moment/locale/zh-cn'
| |
| Module not found: Error: Can't resolve 'postis'
| |
| Module not found: Error: Can't resolve 'prop-types'
| |
| Module not found: Error: Can't resolve 'react'
| |
| Module not found: Error: Can't resolve 'react-dom'
| |
| Module not found: Error: Can't resolve 'react-i18next'
| |
| Module not found: Error: Can't resolve 'react-redux'
| |
| Module not found: Error: Can't resolve 'redux'
| |
| Module not found: Error: Can't resolve 'redux-thunk'
| |
| Module not found: Error: Can't resolve 'url-polyfill'
| |
| Module not found: Error: Can't resolve 'whatwg-fetch'
| |
| [root@ip-172-31-28-115 htdocs]#
| |
| </pre>
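| |
| # all of these unresolved imports smell like an incomplete node_modules rather than code bugs; presumably the fix is to (re)run the dependency install before building again, e.g.:
| |
| <pre>
| |
| # re-fetch the declared dependencies, then retry the build (sketch)
| |
| cd /var/www/html/jitsi.opensourceecology.org/htdocs
| |
| npm install
| |
| make
| |
| </pre>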
| |
| | |
| =Fri Apr 20, 2018=
| |
| # I terminated my free-tier ec2 instance last night so I could ensure that it cost $0 before leaving it running. I checked the bill this morning, and I saw ec2 pop into our bill. The total was $0 for 4 hours of RHEL t2.micro instance-hours under the monthly free tier + $0 for 0.040 GB-Mo of EBS General Purpose SSD provisioned storage under the monthly free tier. The only other fee I could imagine us getting is for general Data Transfer. Currently we're at 348.233G (mostly from huge files into glacier). All of that is $0 under the free tier so far.
| |
| # relaunched the instance with higher confidence that it's totally $0; new instance id = i-05a5af8b75bb5a0d9
| |
| # connected to the new instance over ssh
| |
| <pre>
| |
| user@ose:~$ ssh -p 22 -i .ssh/id_rsa.ose ec2-user@ec2-34-210-153-174.us-west-2.compute.amazonaws.com
| |
| The authenticity of host 'ec2-34-210-153-174.us-west-2.compute.amazonaws.com (34.210.153.174)' can't be established.
| |
| ECDSA key fingerprint is 89:74:84:57:64:c4:9a:71:fd:8d:9d:22:59:3c:d2:4d.
| |
| Are you sure you want to continue connecting (yes/no)? yes
| |
| Warning: Permanently added 'ec2-34-210-153-174.us-west-2.compute.amazonaws.com,34.210.153.174' (ECDSA) to the list of known hosts.
| |
| [ec2-user@ip-172-31-28-115 ~]$
| |
| </pre>
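| |
| # note: I accepted the ECDSA host key above without verifying it; a sketch for checking it out-of-band (assumes the aws cli is configured & that cloud-init printed the fingerprints to the console log on first boot):
| |
| <pre>
| |
| # pull the instance console log & grep out the host key fingerprint block
| |
| aws ec2 get-console-output --instance-id i-05a5af8b75bb5a0d9 --output text | grep -A5 'BEGIN SSH HOST KEY FINGERPRINTS'
| |
| </pre>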
| |
| # fixed various issues in the existing Jitsi install documentation
| |
| # added nginx to jitsi install documentation
| |
| <pre>
| |
| # install it from the repos
| |
| yum install -y nginx
| |
| | |
| # create config file for jitsi.opensourceecology.org
| |
| mkdir -p /var/www/html/jitsi.opensourceecology.org/htdocs
| |
| # note: quote the heredoc delimiter so nginx's $variables aren't expanded by the shell, and use a .conf suffix since the stock nginx.conf only includes conf.d/*.conf
| |
| cat << 'EOF' > /etc/nginx/conf.d/jitsi.opensourceecology.org.conf
| |
| server_names_hash_bucket_size 64;
| |
| | |
| server {
| |
| listen 443;
| |
| # tls configuration that is not covered in this guide
| |
| # we recommend the use of https://certbot.eff.org/
| |
| server_name jitsi.opensourceecology.org;
| |
| # set the root
| |
| root /var/www/html/jitsi.opensourceecology.org/htdocs;
| |
| index index.html;
| |
| location ~ ^/([a-zA-Z0-9=\?]+)$ {
| |
| rewrite ^/(.*)$ / break;
| |
| }
| |
| location / {
| |
| ssi on;
| |
| }
| |
| # BOSH
| |
| location /http-bind {
| |
| proxy_pass http://localhost:5280/http-bind;
| |
| proxy_set_header X-Forwarded-For $remote_addr;
| |
| proxy_set_header Host $http_host;
| |
| }
| |
| }
| |
| EOF
| |
| </pre>
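| |
| # also worth adding to the doc: sanity-check the config & make nginx start on boot (sketch):
| |
| <pre>
| |
| # validate the config, then enable & start the service
| |
| nginx -t
| |
| systemctl enable nginx
| |
| systemctl start nginx
| |
| </pre>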
| |
| # successfully installed maven from the upstream binary tarball https://xmodulo.com/how-to-install-maven-on-centos.html
| |
| <pre>
| |
| wget http://mirror.metrocast.net/apache/maven/maven-3/3.5.3/binaries/apache-maven-3.5.3-bin.tar.gz
| |
| tar -xzvf apache-maven-*.tar.gz -C /usr/local
| |
| pushd /usr/local
| |
| ln -s apache-maven-* maven
| |
| </pre>
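| |
| # so that a bare `mvn` works for all users, a sketch of a profile.d drop-in (path assumed from the symlink above):
| |
| <pre>
| |
| cat << 'EOF' > /etc/profile.d/maven.sh
| |
| export M2_HOME=/usr/local/maven
| |
| export PATH=$M2_HOME/bin:$PATH
| |
| EOF
| |
| </pre>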
| |
| # version of maven was verified as 3.5.3
| |
| <pre>
| |
| [root@ip-172-31-28-115 local]# /usr/local/maven/bin/mvn -v
| |
| Apache Maven 3.5.3 (3383c37e1f9e9b3bc3df5050c29c8aff9f295297; 2018-02-24T19:49:05Z)
| |
| Maven home: /usr/local/maven
| |
| Java version: 1.8.0_171, vendor: Oracle Corporation
| |
| Java home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-7.b10.el7.x86_64/jre
| |
| Default locale: en_US, platform encoding: UTF-8
| |
| OS name: "linux", version: "3.10.0-693.11.6.el7.x86_64", arch: "amd64", family: "unix"
| |
| [root@ip-172-31-28-115 local]#
| |
| </pre>
| |
| | |
| =Thr Apr 19, 2018=
| |
| # I'm still waiting for Marcin to validate the staging wiki & send me the first draft of our migration-day test plan google doc
| |
| # I still don't want to do backups to s3 yet, as I want to migrate the wiki first so we reduce our backup size by ~12G per day once hetzner1 becomes near-0.
| |
| # I checked our usage on dreamhost; we're at 132G. Hopefully that flies under the radar until we finish the wiki migration
| |
| # After those blocked items, jitsi is probably my most important task
| |
| ## I started this item by building a local VM running centos7 http://www.gtlib.gatech.edu/pub/centos/7/isos/x86_64/CentOS-7-x86_64-DVD-1708.iso
| |
| ## I looked into software that's built on jitsi to see if maybe there was a project that bundled what we need into stable builds designed for centos (because everything produced by jitsi is intended for debian) https://jitsi.org/built-on-jitsi/
| |
| ### I found an Administrator's guide to "Jitsi Video Bridge" for the Rocket Chat project that had instructions for enabling Video Bridge because their video chat uses p2p WebRTC, which won't scale beyond 3 users or so. But then there's a separate section to "set up your own Jitsi Video Bridge" which indicates that when you enable the video bridge function in Rocket Chat, it just uses jitsi.org's infrastructure. To run the Video Bridge yourself, they simply link you back to the Jitsi source/docs, which are best for Debian https://rocket.chat/docs/administrator-guides/jitsi-video-bridge/
| |
| ### The Matrix project appears to be designed for 1:1 calls, not conference calls https://matrix.org/docs/guides/faq.html
| |
| ### Atlassian's not-free Stride platform apparently uses Jitsi, and it states "Stride video conferencing is free for unlimited teammates." Now, idk how far it scales, but I do know that Atlassian does provide some of their products to non-profits for free. I emailed Marcin asking for a scanned copy of our income tax exemption letter for proof so I can ask Atlassian if they offer Stride for free to non-profits https://www.stride.com/conferencing#
| |
| ## found the quick install guide for debian https://github.com/jitsi/jitsi-meet/blob/master/doc/quick-install.md
| |
| ## found the server install from source for other OSs https://github.com/jitsi/jitsi-meet/blob/master/doc/manual-install.md
| |
| ## found the mailing list for jitsi users http://lists.jitsi.org/pipermail/users/
| |
| ## found the mailing list for jitsi devs http://lists.jitsi.org/pipermail/dev/
| |
| ## the search function of the lists.jitsi.org site sucks; google is better https://www.google.com/search?q="centos"+site%3Alists.jitsi.org&oq="centos"+site%3Alists.jitsi.org
| |
| ### I only got 1 result regarding the 'speex' dependency, which isn't in the yum repos; the solution was given http://lists.jitsi.org/pipermail/dev/2017-November/035909.html
| |
| ## I installed my VM as "Server with GUI"
| |
| ## The Template VM install failed; apparently CentOS isn't supported, and the guide to using a non-supported OS (written for Arch) shows this process is very non-trivial
| |
| ## I installed the "Server with GUI" on an HVM, but that didn't come up after rebooting after the install
| |
| <pre>
| |
| NMI watchdog: BUG: soft lockup - CPU#X stuck for Ys!
| |
| </pre>
| |
| ## I installed another HVM as minimal, but it had the same issue!
| |
| ## I checked our aws account, and we should get 750 hours of t2.micro instances per month. That's enough to run it 24/7. This item expires 12 months from our signup date, so it's use it or lose it. I'm going to use it.
| |
| ## I added my public key to the aws ec2 service from the aws console & named it 'maltfield'
| |
| ## I changed the default security group to only allow port 22 TCP inbound; outbound is completely open (a CLI equivalent is sketched below, after the ssh session)
| |
| ## I launched a t2.micro node (i-065bfc806b4f39923) with this security group
| |
| ## I used the default 10G EBS, but I confirmed that for 12 months we get 30G of EBS storage for free.
| |
| ## confirmed that I could log in via ssh
| |
| <pre>
| |
| user@ose:~$ ssh -p 22 -i .ssh/id_rsa.ose ec2-user@ec2-52-32-28-0.us-west-2.compute.amazonaws.com
| |
| Enter passphrase for key '.ssh/id_rsa.ose':
| |
| | |
| user@ose:~$ ssh -p 22 -i .ssh/id_rsa.ose ec2-user@ec2-52-32-28-0.us-west-2.compute.amazonaws.com
| |
| Last login: Fri Apr 20 00:40:02 2018 from c-76-97-223-185.hsd1.ga.comcast.net
| |
| [ec2-user@ip-172-31-29-174 ~]$ hostname
| |
| ip-172-31-29-174.us-west-2.compute.internal
| |
| [ec2-user@ip-172-31-29-174 ~]$ ip a
| |
| 1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN qlen 1
| |
| link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
| |
| inet 127.0.0.1/8 scope host lo
| |
| valid_lft forever preferred_lft forever
| |
| inet6 ::1/128 scope host
| |
| valid_lft forever preferred_lft forever
| |
| 2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 9001 qdisc pfifo_fast state UP qlen 1000
| |
| link/ether 02:55:43:fa:bf:b2 brd ff:ff:ff:ff:ff:ff
| |
| inet 172.31.29.174/20 brd 172.31.31.255 scope global dynamic eth0
| |
| valid_lft 3015sec preferred_lft 3015sec
| |
| inet6 fe80::55:43ff:fefa:bfb2/64 scope link
| |
| valid_lft forever preferred_lft forever
| |
| [ec2-user@ip-172-31-29-174 ~]$ date
| |
| Fri Apr 20 00:40:38 UTC 2018
| |
| [ec2-user@ip-172-31-29-174 ~]$ pwd
| |
| /home/ec2-user
| |
| [ec2-user@ip-172-31-29-174 ~]$
| |
| </pre>
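| |
| ## as noted above, here's a CLI equivalent of that port-22-only security group rule (a sketch; the group id is a placeholder, not our real one):
| |
| <pre>
| |
| # allow inbound ssh only; substitute the real security group id
| |
| aws ec2 authorize-security-group-ingress --group-id sg-XXXXXXXX --protocol tcp --port 22 --cidr 0.0.0.0/0
| |
| </pre>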
| |
| ## created a wiki page titled "Jitsi"; here's what I have for the install so far
| |
| <pre>
| |
| # become root
| |
| sudo su -
| |
| | |
| # first, update software
| |
| yum update
| |
| | |
| # install my prereqs
| |
| yum install -y vim screen wget unzip
| |
| | |
| # enable epel
| |
| cat << EOF > /etc/yum.repos.d/epel.repo
| |
| [epel]
| |
| name=Extra Packages for Enterprise Linux 7 - \$basearch
| |
| #baseurl=http://download.fedoraproject.org/pub/epel/7/\$basearch
| |
| metalink=https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=\$basearch
| |
| failovermethod=priority
| |
| enabled=1
| |
| gpgcheck=1
| |
| gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
| |
| | |
| [epel-debuginfo]
| |
| name=Extra Packages for Enterprise Linux 7 - \$basearch - Debug
| |
| #baseurl=http://download.fedoraproject.org/pub/epel/7/\$basearch/debug
| |
| metalink=https://mirrors.fedoraproject.org/metalink?repo=epel-debug-7&arch=\$basearch
| |
| failovermethod=priority
| |
| enabled=0
| |
| gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
| |
| gpgcheck=1
| |
| | |
| [epel-source]
| |
| name=Extra Packages for Enterprise Linux 7 - \$basearch - Source
| |
| #baseurl=http://download.fedoraproject.org/pub/epel/7/SRPMS
| |
| metalink=https://mirrors.fedoraproject.org/metalink?repo=epel-source-7&arch=\$basearch
| |
| failovermethod=priority
| |
| enabled=0
| |
| gpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
| |
| gpgcheck=1
| |
| EOF
| |
| | |
| # and epel key
| |
| cat << EOF > /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-7
| |
| -----BEGIN PGP PUBLIC KEY BLOCK-----
| |
| Version: GnuPG v1.4.11 (GNU/Linux)
| |
| | |
| mQINBFKuaIQBEAC1UphXwMqCAarPUH/ZsOFslabeTVO2pDk5YnO96f+rgZB7xArB
| |
| OSeQk7B90iqSJ85/c72OAn4OXYvT63gfCeXpJs5M7emXkPsNQWWSju99lW+AqSNm
| |
| jYWhmRlLRGl0OO7gIwj776dIXvcMNFlzSPj00N2xAqjMbjlnV2n2abAE5gq6VpqP
| |
| vFXVyfrVa/ualogDVmf6h2t4Rdpifq8qTHsHFU3xpCz+T6/dGWKGQ42ZQfTaLnDM
| |
| jToAsmY0AyevkIbX6iZVtzGvanYpPcWW4X0RDPcpqfFNZk643xI4lsZ+Y2Er9Yu5
| |
| S/8x0ly+tmmIokaE0wwbdUu740YTZjCesroYWiRg5zuQ2xfKxJoV5E+Eh+tYwGDJ
| |
| n6HfWhRgnudRRwvuJ45ztYVtKulKw8QQpd2STWrcQQDJaRWmnMooX/PATTjCBExB
| |
| 9dkz38Druvk7IkHMtsIqlkAOQMdsX1d3Tov6BE2XDjIG0zFxLduJGbVwc/6rIc95
| |
| T055j36Ez0HrjxdpTGOOHxRqMK5m9flFbaxxtDnS7w77WqzW7HjFrD0VeTx2vnjj
| |
| GqchHEQpfDpFOzb8LTFhgYidyRNUflQY35WLOzLNV+pV3eQ3Jg11UFwelSNLqfQf
| |
| uFRGc+zcwkNjHh5yPvm9odR1BIfqJ6sKGPGbtPNXo7ERMRypWyRz0zi0twARAQAB
| |
| tChGZWRvcmEgRVBFTCAoNykgPGVwZWxAZmVkb3JhcHJvamVjdC5vcmc+iQI4BBMB
| |
| AgAiBQJSrmiEAhsPBgsJCAcDAgYVCAIJCgsEFgIDAQIeAQIXgAAKCRBqL66iNSxk
| |
| 5cfGD/4spqpsTjtDM7qpytKLHKruZtvuWiqt5RfvT9ww9GUUFMZ4ZZGX4nUXg49q
| |
| ixDLayWR8ddG/s5kyOi3C0uX/6inzaYyRg+Bh70brqKUK14F1BrrPi29eaKfG+Gu
| |
| MFtXdBG2a7OtPmw3yuKmq9Epv6B0mP6E5KSdvSRSqJWtGcA6wRS/wDzXJENHp5re
| |
| 9Ism3CYydpy0GLRA5wo4fPB5uLdUhLEUDvh2KK//fMjja3o0L+SNz8N0aDZyn5Ax
| |
| CU9RB3EHcTecFgoy5umRj99BZrebR1NO+4gBrivIfdvD4fJNfNBHXwhSH9ACGCNv
| |
| HnXVjHQF9iHWApKkRIeh8Fr2n5dtfJEF7SEX8GbX7FbsWo29kXMrVgNqHNyDnfAB
| |
| VoPubgQdtJZJkVZAkaHrMu8AytwT62Q4eNqmJI1aWbZQNI5jWYqc6RKuCK6/F99q
| |
| thFT9gJO17+yRuL6Uv2/vgzVR1RGdwVLKwlUjGPAjYflpCQwWMAASxiv9uPyYPHc
| |
| ErSrbRG0wjIfAR3vus1OSOx3xZHZpXFfmQTsDP7zVROLzV98R3JwFAxJ4/xqeON4
| |
| vCPFU6OsT3lWQ8w7il5ohY95wmujfr6lk89kEzJdOTzcn7DBbUru33CQMGKZ3Evt
| |
| RjsC7FDbL017qxS+ZVA/HGkyfiu4cpgV8VUnbql5eAZ+1Ll6Dw==
| |
| =hdPa
| |
| -----END PGP PUBLIC KEY BLOCK-----
| |
| EOF
| |
| | |
| # update again
| |
| yum update
| |
| | |
| ###########
| |
| # prosody #
| |
| ###########
| |
| | |
| # install jitsi prereqs
| |
| yum install -y prosody
| |
| | |
| # configure prosody
| |
| mkdir -p /etc/prosody/conf.avail/
| |
| cat << EOF > /etc/prosody/conf.avail/jitsi.opensourceecology.org.cfg.lua
| |
| VirtualHost "jitsi.opensourceecology.org"
| |
| authentication = "anonymous"
| |
| ssl = {
| |
| key = "/var/lib/prosody/jitsi.opensourceecology.org.key";
| |
| certificate = "/var/lib/prosody/jitsi.opensourceecology.org.crt";
| |
| }
| |
| modules_enabled = {
| |
| "bosh";
| |
| "pubsub";
| |
| }
| |
| c2s_require_encryption = false
| |
| | |
| VirtualHost "auth.jitsi.opensourceecology.org"
| |
| ssl = {
| |
| key = "/var/lib/prosody/auth.jitsi.opensourceecology.org.key";
| |
| certificate = "/var/lib/prosody/auth.jitsi.opensourceecology.org.crt";
| |
| }
| |
| authentication = "internal_plain"
| |
| | |
| admins = { "focus@auth.jitsi.opensourceecology.org" }
| |
| | |
| Component "conference.jitsi.example.com" "muc"
| |
| Component "jitsi-videobridge.jitsi.opensourceecology.org"
| |
| component_secret = "YOURSECRET1"
| |
| Component "focus.jitsi.opensourceecology.org"
| |
| component_secret = "YOURSECRET2"
| |
| EOF
| |
|
| |
| ln -s /etc/prosody/conf.avail/jitsi.opensourceecology.org.cfg.lua /etc/prosody/conf.d/jitsi.opensourceecology.org.cfg.lua
| |
|
| |
| prosodyctl cert generate jitsi.opensourceecology.org
| |
| prosodyctl cert generate auth.jitsi.opensourceecology.org
| |
|
| |
| mkdir -p /usr/local/share/ca-certificates
| |
| ln -sf /var/lib/prosody/auth.jitsi.opensourceecology.org.crt /usr/local/share/ca-certificates/auth.jitsi.opensourceecology.org.crt
| |
|
| |
| # this binary doesn't exist; TODO: find out if it's necessary?
| |
| update-ca-certificates -f
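| |
| # NOTE: update-ca-certificates is the debian tool; a sketch of the centos 7
| |
| # equivalent (assuming the ca-certificates package is installed) would be:
| |
| #   cp /var/lib/prosody/auth.jitsi.opensourceecology.org.crt /etc/pki/ca-trust/source/anchors/
| |
| #   update-ca-trust extract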
| |
|
| |
| prosodyctl register focus auth.jitsi.opensourceecology.org YOURSECRET3
| |
|
| |
| #########
| |
| # NGINX #
| |
| #########
| |
|
| |
| TODO
| |
|
| |
| #####################
| |
| # Jitsi Videobridge #
| |
| #####################
| |
|
| |
| # install depends
| |
| yum install -y java-1.8.0-openjdk
| |
|
| |
| wget https://download.jitsi.org/jitsi-videobridge/linux/jitsi-videobridge-linux-x64-1053.zip
| |
| unzip jitsi-videobridge-linux-x64-1053.zip
| |
|
| |
| # per the manual-install doc, the properties file lives inside a ~/.sip-communicator directory
| |
| mkdir -p /home/ec2-user/.sip-communicator
| |
| cat << EOF > /home/ec2-user/.sip-communicator/sip-communicator.properties
| |
| org.jitsi.impl.neomedia.transform.srtp.SRTPCryptoContext.checkReplay=false
| |
| EOF
| |
|
| |
| chown -R ec2-user:ec2-user /home/ec2-user/.sip-communicator
| |
|
| |
| # cd into the extracted dir (name assumed from the zip) & launch the bridge
| |
| cd jitsi-videobridge-linux-x64-1053
| |
| ./jvb.sh --host=localhost --domain=jitsi.opensourceecology.org --port=5347 --secret=YOURSECRET1
| |
| </pre>
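| |
| ## jvb.sh runs in the foreground; to keep the bridge alive after the ssh session ends, a sketch using screen (installed as a prereq above):
| |
| <pre>
| |
| # run the bridge detached; reattach later with `screen -r jvb`
| |
| screen -dmS jvb ./jvb.sh --host=localhost --domain=jitsi.opensourceecology.org --port=5347 --secret=YOURSECRET1
| |
| </pre>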
| |
| # terminated the instance; I'll check to see what the budget says in a couple days to make sure today's testing was totally free
| |
| | |
| =Mon Apr 16, 2018=
| |
| # fixed a false-positive mod_security issue that Catarina hit while trying to update obi. id = 981318, sqli
| |
| # updated my log & the current meeting doc
| |
| | |
| =Sun Apr 15, 2018=
| |
| # Marcin got back to me pointing out a few differences between the staging & prod wikis:
| |
| <blockquote>
| |
| 1. Color on second graph of https://wiki.opensourceecology.org/wiki/Marcin_Log are different than original. Not sure if that matters.
| |
| 2. Spacing is added after every main heading section in ephemeral. For example, https://wiki.opensourceecology.org/wiki/AbeAnd_Log. Or my log.
| |
| 3. Error upon saving changes to my log. https://wiki.opensourceecology.org/index.php?title=Marcin_Log&action=submit . See screenshot, Point3.png.
| |
| 4. I tried 3 again. It works. But Preview doesn't work - see screenshot. When I hit show preview, it gives me an error. See 2 screenshots Point4.png.
| |
| 5. Picture upload doesn't work - see Point5 which I tried to upload from my log.
| |
| </blockquote>
| |
| # my responses below for each
| |
| ## (1) graph color
| |
| ### the color appears to change on every refresh; this appears to be how Lex coded it
| |
| #### http://opensourceecology.org/wiki/Development_Team_Effort
| |
| #### https://wiki.opensourceecology.org/wiki/Development_Team_Effort
| |
| ## (2) header spacing
| |
| ### the element picker in the firefox debugger shows a css rule for ".mw-body-content h1" with "margin-top: 1em;" on our new wiki, but not on the old wiki. When I check the same thing on wikipedia, I see they also have a "margin-top: 1em;"
| |
| ### I found the actual line to be in htdocs/skins/Vector/components/common.less
| |
| <pre>
| |
| .mw-body-content {
| |
| position: relative;
| |
| line-height: @content-line-height;
| |
| font-size: @content-font-size;
| |
| z-index: 0;
| |
|
| |
| p {
| |
| line-height: inherit;
| |
| margin: 0.5em 0;
| |
| }
| |
| h1 {
| |
| margin-top: 1em;
| |
| }
| |
|
| |
| h2 {
| |
| font-size: 1.5em;
| |
| margin-top: 1em;
| |
| }
| |
| </pre>
| |
| ### the oldest commit of this file in gerrit (from 2014-08-07) does *not* have this margin, so we should be able to isolate when it was added https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/skins/Vector/+/d28f09df312edbb72b19ac6ac5d124f11007a4ba/components/common.less
| |
| ### ok, I isolated it to this change on 2015-05-23 https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/skins/Vector/+/35ca341ed4e9acfa00505e216d2416f73c253948%5E%21/#F0
| |
| <blockquote>
| |
| Minor header fixes for Typography Refresh
| |
| | |
| This fixes top-margin for H1 inside .mw-body-content, and increase
| |
| font-size for H3 from 1.17em to 1.2em (bumping to 17px default).
| |
| | |
| Final patch for this bug.
| |
| | |
| Bug: T66653
| |
| Change-Id: I1e75bc4fc3e04ca6c9238d4ce116136e9bafacd1
| |
| </blockquote>
| |
| ### the referenced bug is this one: https://phabricator.wikimedia.org/T66653
| |
| ### I recommended to Marcin that we just keep the theme defaults the same as what Wikipedia uses
| |
| ## (3) Request Entity Too Large
| |
| ### I confirmed that I got this error when attempting to edit a very long article = my log
| |
| ### the response I got was an error 413 Request Entity Too Large
| |
| <blockquote>
| |
| Request Entity Too Large
| |
| The requested resource
| |
| /index.php
| |
| does not allow request data with POST requests, or the amount of data provided in the request exceeds the capacity limit.
| |
| </blockquote>
| |
| ### at the same time, this popped into my logs
| |
| <pre>
| |
| [root@hetzner2 httpd]# tail -f wiki.opensourceecology.org/access_log wiki.opensourceecology.org/error_log error_log
| |
| ...
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:18:02.842032 2018] [:error] [pid 32271] [client 127.0.0.1] ModSecurity: Request body no files data length is larger than the configured limit (131072).. Deny with code (413) [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNfGhEne29xOSpIJVtJbQAAAAg"]
| |
| | |
| ==> wiki.opensourceecology.org/access_log <==
| |
| 127.0.0.1 - - [15/Apr/2018:14:18:02 +0000] "POST /index.php?title=Maltfield_log_2018&action=submit HTTP/1.0" 413 338 "https://wiki.opensourceecology.org/index.php?title=Maltfield_log_2018&action=submit" "Mozilla/5.0 (X11; OpenBSD amd64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.82 Safari/537.36"
| |
| </pre>
| |
| ### I fixed this by setting the SecRequestBodyNoFilesLimit to 1MB in "/etc/httpd/conf.d/00-wiki.opensourceecology.org.conf"
| |
| <pre>
| |
| # disable specific mod_security rules as needed
| |
| # (found by logs in: /var/log/httpd/modsec_audit.log)
| |
| <IfModule security2_module>
| |
| SecRuleRemoveById 960015 960024 960904 960015 960017 970901 950109 981172 981231 981245 973338 973306 950901 981317 959072 981257 981243 958030 973300 973304 973335 973333 973316 200004 973347 981319 981240 973301 973344 960335 960020 950120 959073 981244 981248 981253 973334 973332 981242 981246 960915 200003 981173 981318 981260 950911 973302 973324 973317 981255 958057 958056 973327 950018 950001 958008 973329
| |
|
| |
| # set the (sans file) POST size limit to 1M (default is 128K)
| |
| SecRequestBodyNoFilesLimit 1000000
| |
| </IfModule>
| |
| </pre>
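| |
| ### fwiw, a whitelist change like this only takes effect after a config check & reload (sketch):
| |
| <pre>
| |
| # validate & apply the updated apache/mod_security config
| |
| apachectl configtest
| |
| systemctl reload httpd
| |
| </pre>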
| |
| ### but now I got a 403 forbidden false-positive generic attack; it saw "wget" in my own log file :\
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:38:12.390804 2018] [:error] [pid 20103] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:[\\\\;\\\\|\\\\`]\\\\W*?\\\\bcc|\\\\b(wget|curl))\\\\b|\\\\/cc(?:[\\\\'\\"\\\\|\\\\;\\\\`\\\\-\\\\s]|$))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "25"] [id "950907"] [rev "2"] [msg "System Command Injection"] [data "Matched Data: wget found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again just a..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "9"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/COMMAND_INJECTION"] [tag "WASCTC/WASC-31"] [tag "OWASP_TOP_10/A1"] [tag "PCI/6.5. [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNj1GTPX8kx9Zp-CaA3zQAAAAM"]
| |
| | |
| ==> wiki.opensourceecology.org/access_log <==
| |
| 127.0.0.1 - - [15/Apr/2018:14:38:12 +0000] "POST /index.php?title=Maltfield_log_2018&action=submit HTTP/1.0" 403 211 "https://wiki.opensourceecology.org/index.php?title=Maltfield_log_2018&action=submit" "Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:48.0) Gecko/20100101 Firefox/48.0"
| |
| </pre>
| |
| ### I whitelisted "950907", but I got another. I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:39:55.629972 2018] [:error] [pid 20239] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "[\\\\n\\\\r](?:content-(type|length)|set-cookie|location):" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "134"] [id "950910"] [rev "2"] [msg "HTTP Response Splitting Attack"] [data "Matched Data: \\x0acontent-type: found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0amy work log from the year 2018. i intentionally made this verbose to make future admin's work easier when troubleshooting. the more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future ose sysadmin.\\x0d\\x0a\\x0d\\x0a=see also=\\x0d\\x0a# maltfield_log\\x0d\\x0a# user:maltfield\\x0d\\x0a# special:contributions/maltfield\\x0d\\x0a\\x0d\\x0a=sun apr 08, 2018=\\x0d\\x0a# i checked..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "9"] [accuracy "9"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNkO8vleiEPKc6tRLX2PgAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "950005"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:46:24.788412 2018] [:error] [pid 21166] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?:\\\\b(?:\\\\.(?:ht(?:access|passwd|group)|www_?acl)|global\\\\.asa|httpd\\\\.conf|boot\\\\.ini)\\\\b|\\\\/etc\\\\/)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "205"] [id "950005"] [rev "3"] [msg "Remote File Access Attempt"] [data "Matched Data: /etc/ found within ARGS:wpTextbox1: test1 my work log from the year 2018. i intentionally made this verbose to make future admins work easier when troubleshooting. the more keywords error messages etc that are listed in this log the more helpful it will be for the future ose sysadmin. =see also= # maltfield_log # user:maltfield # special:contributions/maltfield =sun apr 08 2018= # i checked again just after midnight the retry appears to have worked pretty great. just 2 archives ..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "9"] [accuracy "9"] [tag "OWASP_CRS/WEB_ATTACK/FILE_INJECTION"] [tag "WASCTC/WASC-33"] [tag "OWASP_TOP_10 [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNlwEB2obt3oWmzHguGRAAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "950006"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:48:52.180898 2018] [:error] [pid 21264] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?:\\\\b(?:(?:n(?:et(?:\\\\b\\\\W+?\\\\blocalgroup|\\\\.exe)|(?:map|c)\\\\.exe)|t(?:racer(?:oute|t)|elnet\\\\.exe|clsh8?|ftp)|(?:w(?:guest|sh)|rcmd|ftp)\\\\.exe|echo\\\\b\\\\W*?\\\\by+)\\\\b|c(?:md(?:(?:\\\\.exe|32)\\\\b|\\\\b\\\\W*?\\\\/c)|d(?:\\\\b\\\\W*?[\\\\/]|\\\\W*?\\\\.\\\\.)|hmod.{0,40}?\\\\ ..." at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "221"] [id "950006"] [rev "3"] [msg "System Command Injection"] [data "Matched Data: `echo found within ARGS:wpTextbox1: test1 my work log from the year 2018. i intentionally made this verbose to make future admins work easier when troubleshooting. the more keywords error messages etc that are listed in this log the more helpful it will be for the future ose sysadmin. =see also= # maltfield_log # user:maltfield # special:contributions/maltfield =sun apr 08 2018= # i checked again just after midnight the retry appears to have worked pretty great. just 2 archives ..."] [severity [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNmVJDQehGtAwf9cZFxVQAAAAE"]
| |
| </pre>
| |
| ### next I got a false-positive from "959151"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:50:23.449483 2018] [:error] [pid 21758] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "<\\\\?(?!xml)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "230"] [id "959151"] [rev "2"] [msg "PHP Injection Attack"] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "9"] [accuracy "9"] [tag "OWASP_CRS/WEB_ATTACK/PHP_INJECTION"] [tag "WASCTC/WASC-15"] [tag "OWASP_TOP_10/A6"] [tag "PCI/6.5.2"] [tag "WASCTC/WASC-25"] [tag "OWASP_TOP_10/A1"] [tag "OWASP_AppSensor/CIE4"] [tag "PCI/6.5.2"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNmr7Npwd4Zn01cn3kYrgAAAAg"]
| |
| </pre>
| |
| ### next I got a false-positive from "958976"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:51:47.116321 2018] [:error] [pid 21825] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i)(?:\\\\b(?:f(?:tp_(?:nb_)?f?(?:ge|pu)t|get(?:s?s|c)|scanf|write|open|read)|gz(?:(?:encod|writ)e|compress|open|read)|s(?:ession_start|candir)|read(?:(?:gz)?file|dir)|move_uploaded_file|(?:proc_|bz)open|call_user_func)|\\\\$_(?:(?:pos|ge)t|session))\\\\b" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_40_generic_attacks.conf"] [line "233"] [id "958976"] [rev "2"] [msg "PHP Injection Attack"] [data "Matched Data: $_GET found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again just ..."] [severity "CRITICAL [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNnAwxMy3-QXaGk8S8DZAAAAAg"]
| |
| </pre>
| |
| ### next I got a false-positive from "950007"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:53:13.101976 2018] [:error] [pid 21880] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:\\\\b(?:(?:s(?:ys\\\\.(?:user_(?:(?:t(?:ab(?:_column|le)|rigger)|object|view)s|c(?:onstraints|atalog))|all_tables|tab)|elect\\\\b.{0,40}\\\\b(?:substring|users?|ascii))|m(?:sys(?:(?:queri|ac)e|relationship|column|object)s|ysql\\\\.(db|user))|c(?:onstraint ..." at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "116"] [id "950007"] [rev "2"] [msg "Blind SQL Injection Attack"] [data "Matched Data: substring found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again j..."] [ [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNnWSnNDDq2JC5w2P06-AAAAAk"]
| |
| </pre>
| |
| ### next I got a false-positive from "959070"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:54:27.832922 2018] [:error] [pid 21954] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "\\\\b(?i:having)\\\\b\\\\s+(\\\\d{1,10}|'[^=]{1,10}')\\\\s*?[=<>]|(?i:\\\\bexecute(\\\\s{1,5}[\\\\w\\\\.$]{1,5}\\\\s{0,3})?\\\\()|\\\\bhaving\\\\b ?(?:\\\\d{1,10}|[\\\\'\\"][^=]{1,10}[\\\\'\\"]) ?[=<>]+|(?i:\\\\bcreate\\\\s+?table.{0,20}?\\\\()|(?i:\\\\blike\\\\W*?char\\\\W*?\\\\()|(?i:(?:(select(.* ..." at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "130"] [id "959070"] [rev "2"] [msg "SQL Injection Attack"] [data "Matched Data: from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again just after midnight; the retry appears to have worked pretty great. Just 2..."] [severi [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNno1tQGkBXlULnvmvpaAAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "950908"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:55:32.054195 2018] [:error] [pid 22398] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:\\\\b(?:coalesce\\\\b|root\\\\@))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "140"] [id "950908"] [rev "2"] [msg "SQL Injection Attack."] [data "Matched Data: root@ found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again just ..."] [ver "OWASP_CRS/2.2.9"] [maturity "9"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/SQL_INJECTION"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNn44JlXq26uhw4uNUXCAAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "981250"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:57:24.386324 2018] [:error] [pid 22465] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:(select|;)\\\\s+(?:benchmark|if|sleep)\\\\s*?\\\\(\\\\s*?\\\\(?\\\\s*?\\\\w+))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "215"] [id "981250"] [msg "Detects SQL benchmark and sleep injection attempts including conditional queries"] [data "Matched Data: ; \\x0d\\x0a\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09\\x09 \\x0d\\x0a\\x09 if (beresp found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier w..."] [severity "CRITICAL"] [tag "OWASP_CRS/WEB_ATTACK/SQL_INJECTION"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNoVNnaOuebwb3@ICNKAwAAAAg"]
| |
| </pre>
| |
| ### next I got a false-positive from "981241"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:58:25.610212 2018] [:error] [pid 22500] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:[\\\\s()]case\\\\s*?\\\\()|(?:\\\\)\\\\s*?like\\\\s*?\\\\()|(?:having\\\\s*?[^\\\\s]+\\\\s*?[^\\\\w\\\\s])|(?:if\\\\s?\\\\([\\\\d\\\\w]\\\\s*?[=<>~]))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "217"] [id "981241"] [msg "Detects conditional SQL injection attempts"] [data "Matched Data: having 'fundraiser' found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I check..."] [severity "CRITICAL"] [tag "OWASP_CRS/WEB_ATTACK/SQL_INJECTION"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNokaA9VhPOcZIEe5Ri-AAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "981252"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 14:59:16.861059 2018] [:error] [pid 22538] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:alter\\\\s*?\\\\w+.*?character\\\\s+set\\\\s+\\\\w+)|([\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98];\\\\s*?waitfor\\\\s+time\\\\s+[\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98])|(?:[\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98];.*?:\\\\s*?goto))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "219"] [id "981252"] [msg "Detects MySQL charset switch and MSSQL DoS attempts"] [data "Matched Data: alternative. Other supported op code caches are: mmTurck, WinCache, XCache.\\x0d\\x0a\\x0d\\x0aOpcode caches store the compiled output of PHP scripts, greatly reducing the amount of time needed to run a script multiple times. MediaWiki does not need to be configured to do PHP bytecode caching and will \\x22just work\\x22 once installed and enabled them. \\x0d\\x0a</blockquote>\\x0d\\x0a## but we can't use OPcache for the mediawiki caching (ie: message caching) since it is only a opcode cache, not an ..."] [severity "CRITICAL"] [tag "OWA [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNoxMBiBJrTpPno-1d0JgAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "981256"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:00:06.400039 2018] [:error] [pid 22593] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:merge.*?using\\\\s*?\\\\()|(execute\\\\s*?immediate\\\\s*?[\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98])|(?:\\\\W+\\\\d*?\\\\s*?having\\\\s*?[^\\\\s\\\\-])|(?:match\\\\s*?[\\\\w(),+-]+\\\\s*?against\\\\s*?\\\\())" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "221"] [id "981256"] [msg "Detects MATCH AGAINST, MERGE, EXECUTE IMMEDIATE and HAVING injections"] [data "Matched Data: having 7 found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I checked again j..."] [severity "CRITICAL"] [tag "OWASP_CRS/WEB_ [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNo9sgDr4oS7e@QketkjwAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "981249"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:01:37.432935 2018] [:error] [pid 23080] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:[\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98]\\\\s+and\\\\s*?=\\\\W)|(?:\\\\(\\\\s*?select\\\\s*?\\\\w+\\\\s*?\\\\()|(?:\\\\*\\\\/from)|(?:\\\\+\\\\s*?\\\\d+\\\\s*?\\\\+\\\\s*?@)|(?:\\\\w[\\"'`\\xc2\\xb4\\xe2\\x80\\x99\\xe2\\x80\\x98]\\\\s*?(?:[-+=|@]+\\\\s*?)+[\\\\d(])|(?:coalesce\\\\s*?\\\\(|@@\\\\w+\\\\s*?[ ..." at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "233"] [id "981249"] [msg "Detects chained SQL injection attempts 2/2"] [data "Matched Data: case my gpg/tar manipulations deletes the originals and I don't want to pay to download them from Glacier again!\\x0d\\x0a<pre>\\x0d\\x0a[root@hetzner2 glacier-cli]# mkdir ../orig\\x0d\\x0a[root@hetzner2 glacier-cli]# cp hetzner1_20170901-052001.fileList.txt.bz2.gpg\\x5c:\\x5c this\\x5c is\\x5c a\\x5c metadata\\x5c file\\x5c showing\\x5c the\\x5c file\\x5c and\\x5c dir\\x5c list\\x5c contents\\x5c of\\x5c the\\x5c archive\\x5c of\\x5c the\\x5c same\\x5c prefix\\x5c name ../orig/\\x0d\\x0a[root@hetzner2 glacier-cli]# c. [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNpUcmL3wtkV-YO1ihZDAAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "981251"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:05:19.598269 2018] [:error] [pid 23148] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:create\\\\s+function\\\\s+\\\\w+\\\\s+returns)|(?:;\\\\s*?(?:select|create|rename|truncate|load|alter|delete|update|insert|desc)\\\\s*?[\\\\[(]?\\\\w{2,}))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "241"] [id "981251"] [msg "Detects MySQL UDF injection and other data/structure manipulation attempts"] [data "Matched Data: ;\\x0d\\x0aCREATE USER found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0aMy work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin.\\x0d\\x0a\\x0d\\x0a=See Also=\\x0d\\x0a# Maltfield_Log\\x0d\\x0a# User:Maltfield\\x0d\\x0a# Special:Contributions/Maltfield\\x0d\\x0a\\x0d\\x0a=Sun Apr 08, 2018=\\x0d\\x0a# I chec..."] [severity "CRITICAL"] [tag "OWASP_CRS/WEB_ATTACK/SQL_INJECTION"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNqL8Ru5tp4rXAvCPJSGgAAAAQ"]
| |
| </pre>
| |
| ### next I got a false-positive from "973336"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:06:20.052955 2018] [:error] [pid 23683] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i)(<script[^>]*>[\\\\s\\\\S]*?<\\\\/script[^>]*>|<script[^>]*>[\\\\s\\\\S]*?<\\\\/script\\\\s\\\\S*[\\\\s\\\\S]|<script[^>]*>[\\\\s\\\\S]*?<\\\\/script[\\\\s]*[\\\\s]|<script[^>]*>[\\\\s\\\\S]*?<\\\\/script|<script[^>]*>[\\\\s\\\\S]*?)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "14"] [id "973336"] [rev "1"] [msg "XSS Filter - Category 1: Script Tag Vector"] [data "Matched Data: <script>\\x0d\\x0a (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){\\x0d\\x0a (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),\\x0d\\x0a m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)\\x0d\\x0a })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');\\x0d\\x0a\\x0d\\x0a ga('create', 'UA-58526017-1', 'auto');\\x0d\\x0a ga('send', 'pageview');\\x0d\\x0a\\x0d\\x0a</script> found within ARGS..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [matu [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNqa27f2r5Ci7rKo19IZgAAAAE"]
| |
| </pre>
| |
| ### next I got a false-positive from "958006"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:07:37.678507 2018] [:error] [pid 23747] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "<body\\\\b.*?\\\\bbackground\\\\b" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "115"] [id "958006"] [rev "2"] [msg "Cross-site Scripting (XSS) Attack"] [data "Matched Data: <body> <h1>not found</h1> <p\\x22. skipping. all renewal attempts failed. the following certs could not be renewed: /etc/letsencrypt/live/openbuildinginstitute.org/fullchain.pem (failure) ------------------------------------------------------------------------------- processing /etc/letsencrypt/renewal/opensourceecology.org.conf ------------------------------------------------------------------------------- ------------------------------------------------------------------------------- proce..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNquZaUhFQqxRu6UgZZrAAAAAE"]
| |
| </pre>
| |
| ### next I got a false-positive from "958049"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:08:36.257521 2018] [:error] [pid 23804] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "\\\\< ?meta\\\\b" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "169"] [id "958049"] [rev "2"] [msg "Cross-site Scripting (XSS) Attack"] [data "Matched Data: <meta found within ARGS:wpTextbox1: test1 my work log from the year 2018. i intentionally made this verbose to make future admin's work easier when troubleshooting. the more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future ose sysadmin. =see also= # maltfield_log # user:maltfield # special:contributions/maltfield =sun apr 08, 2018= # i checked again just after midnight; the retry appears to have worked pretty great. just 2 arc..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNq8w21sJT73GKTS-bkXgAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "958051"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:09:43.732355 2018] [:error] [pid 23864] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "\\\\< ?script\\\\b" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "211"] [id "958051"] [rev "2"] [msg "Cross-site Scripting (XSS) Attack"] [data "Matched Data: <script found within ARGS:wpTextbox1: test1 my work log from the year 2018. i intentionally made this verbose to make future admin's work easier when troubleshooting. the more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future ose sysadmin. =see also= # maltfield_log # user:maltfield # special:contributions/maltfield =sun apr 08, 2018= # i checked again just after midnight; the retry appears to have worked pretty great. just 2 a..."] [severity "CRITICAL"] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNrN-L3I7mwD9Dr@YNrLAAAAAE"]
| |
| </pre>
| |
| ### next I got a false-positive from "973305"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:12:17.682638 2018] [:error] [pid 24338] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(asfunction|javascript|vbscript|data|mocha|livescript):" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "351"] [id "973305"] [rev "2"] [msg "XSS Attack Detected"] [data "Matched Data: data: found within ARGS:wpTextbox1: test1myworklogfromtheyear2018.iintentionallymadethisverbosetomakefutureadmin'sworkeasierwhentroubleshooting.themorekeywords,errormessages,etcthatarelistedinthislog,themorehelpfulitwillbeforthefutureosesysadmin.=seealso=#maltfield_log#user:maltfield#special:contributions/maltfield=sunapr08,2018=#icheckedagainjustaftermidnight;theretryappearstohaveworkedprettygreat.just2archivesfailedonthisrun<pre>hancock%datesatapr722:14:46pdt2018hancock%pwd/ho..."] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNr0e4ceIffaD1NC6xG1AAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "973314"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:13:51.802124 2018] [:error] [pid 24463] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "<!(doctype|entity)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "464"] [id "973314"] [rev "2"] [msg "XSS Attack Detected"] [data "Matched Data: <!doctype found within ARGS:wpTextbox1: test1\\x0d\\x0a\\x0d\\x0amy work log from the year 2018. i intentionally made this verbose to make future admin's work easier when troubleshooting. the more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future ose sysadmin.\\x0d\\x0a\\x0d\\x0a=see also=\\x0d\\x0a# maltfield_log\\x0d\\x0a# user:maltfield\\x0d\\x0a# special:contributions/maltfield\\x0d\\x0a\\x0d\\x0a=sun apr 08, 2018=\\x0d\\x0a# i checked again j..."] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNsL2emkLMGeDo3MiZaagAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "973331"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:14:45.386955 2018] [:error] [pid 24511] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:<script.*?>)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "472"] [id "973331"] [rev "2"] [msg "IE XSS Filters - Attack Detected."] [data "Matched Data: <script> found within ARGS:wpTextbox1: test1 My work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin. =See Also= # Maltfield_Log # User:Maltfield # Special:Contributions/Maltfield =Sun Apr 08, 2018= # I checked again just after midnight; the retry appears to have worked pretty great. Just 2 ..."] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNsZEtxjbRbHLbCcH0csAAAAAU"]
| |
| </pre>
| |
| ### next I got a false-positive from "973330"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:15:55.146051 2018] [:error] [pid 24972] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:<script.*?[ /+\\\\t]*?((src)|(xlink:href)|(href))[ /+\\\\t]*=)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "476"] [id "973330"] [rev "2"] [msg "IE XSS Filters - Attack Detected."] [data "Matched Data: <script> (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){ (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o), m=s.getElementsByTagName(o)[0];a.async=1;a.src= found within ARGS:wpTextbox1: test1 My work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin..."] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5. [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNsqmLfl9DexEr1vNbnJwAAAAA"]
| |
| </pre>
| |
| ### next I got a false-positive from "973348"; I whitelisted it too
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:16:58.419220 2018] [:error] [pid 25067] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:<META[ /+\\\\t].*?charset[ /+\\\\t]*=)" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_xss_attacks.conf"] [line "492"] [id "973348"] [rev "2"] [msg "IE XSS Filters - Attack Detected."] [data "Matched Data: <meta charset= found within ARGS:wpTextbox1: test1 My work log from the year 2018. I intentionally made this verbose to make future admin's work easier when troubleshooting. The more keywords, error messages, etc that are listed in this log, the more helpful it will be for the future OSE Sysadmin. =See Also= # Maltfield_Log # User:Maltfield # Special:Contributions/Maltfield =Sun Apr 08, 2018= # I checked again just after midnight; the retry appears to have worked pretty great. J..."] [ver "OWASP_CRS/2.2.9"] [maturity "8"] [accuracy "8"] [tag "OWASP_CRS/WEB_ATTACK/XSS"] [tag "WASCTC/WASC-8"] [tag "WASCTC/WASC-22"] [tag "OWASP_TOP_10/A2"] [tag "OWASP_AppSensor/IE1"] [tag "PCI/6.5.1"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNs5z8-aFiK3LAiiWdNlQAAAAU"]
| |
| </pre>
| |
| ### finally, the edit went through. Here's what the apache conf file looks like now:
| |
| <pre>
| |
| [root@hetzner2 conf.d]# cat 00-wiki.opensourceecology.org.conf
| |
| ...
| |
| # disable mod_security with rules as needed
| |
| # (found by logs in: /var/log/httpd/modsec_audit.log)
| |
| <IfModule security2_module>
| |
| SecRuleRemoveById 960015 960024 960904 960015 960017 970901 950109 981172 981231 981245 973338 973306 950901 981317 959072 981257 981243 958030 973300 973304 973335 973333 973316 200004 973347 981319 981240 973301 973344 960335 960020 950120 959073 981244 981248 981253 973334 973332 981242 981246 960915 200003 981173 981318 981260 950911 973302 973324 973317 981255 958057 958056 973327 950018 950001 958008 973329 950907 950910 950005 950006 959151 958976 950007 959070 950908 981250 981241 981252 981256 981249 981251 973336 958006 958049 958051 973305 973314 973331 973330 973348
| |
| | |
| # set the (sans file) POST size limit to 1M (default is 128K)
| |
| SecRequestBodyNoFilesLimit 1000000
| |
| </IfModule>
| |
| ...
| |
| [root@hetzner2 conf.d]#
| |
| </pre>
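| |
| ### side note for future whitelisting rounds: rather than chasing the IDs one page-save at a time, the rule IDs that fired can be grepped out of the error log in one go. A sketch, assuming GNU grep and the error_log shown in the tail output above:
| |
| <pre>
| |
| # list the distinct ModSecurity rule IDs that fired
| |
| # (path relative to the httpd log dir, per the tail output above)
| |
| grep -oE 'id "[0-9]+"' wiki.opensourceecology.org/error_log | tr -d '"' | awk '{print $2}' | sort -un
| |
| </pre>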
| |
| ### I tried to edit Marcin's log, but I got another Forbidden; I whitelisted "981276". Then it worked fine
| |
| <pre>
| |
| ==> wiki.opensourceecology.org/error_log <==
| |
| [Sun Apr 15 15:21:28.339555 2018] [:error] [pid 25157] [client 127.0.0.1] ModSecurity: Access denied with code 403 (phase 2). Pattern match "(?i:(?:(union(.*?)select(.*?)from)))" at ARGS:wpTextbox1. [file "/etc/httpd/modsecurity.d/activated_rules/modsecurity_crs_41_sql_injection_attacks.conf"] [line "225"] [id "981276"] [msg "Looking for basic sql injection. Common attack string for mysql, oracle and others."] [data "Matched Data: Unions in St. Joseph]]. How to Find a Good Local Bank. Seed Eco-Home Utilities. How to Glue PVC and ABS.\\x0d\\x0a\\x0d\\x0a=Tue Sep 12, 2017=\\x0d\\x0aOSE HeroX - The Open Source Microfactory Challenge. Tiny Homes.\\x0d\\x0a=Mon Sep 11, 2017=\\x0d\\x0aComparison of CNC Milling to 3D Printing in Metal. Putin Interviews. Track Construction Set. Unauthorized ACH. \\x0d\\x0a\\x0d\\x0a=Sat Sep 9, 2017=\\x0d\\x0a2\\x22 Universal Axis. [[The Monetary System Visually Explain..."] [severity "CRITICAL"] [tag "OWASP_CRS/WEB_ATTACK/SQL_INJECTION"] [hostname "wiki.opensourceecology.org"] [uri "/index.php"] [unique_id "WtNt@GQFLmh587VJM0wsSgAAAAM"]
| |
| </pre>
| |
| ### I'll ask Marcin to try again
| |
| ## (3) Chrome ERR_BLOCKED_BY_XSS_AUDITOR on Preview
| |
| ### I was able to reproduce this issue in Chrome only on Preview of Marcin's log file. A quick google search suggests that it's a bug in Chrome v57 & fixed in v58. I confirmed that I have Chromium v57.0.2987.98.
| |
| ### I asked Marcin to just use Firefox until the bug in Chromium is fixed
| |
| ## (4) thumbnail generation error
| |
| ### so it does appear that the image uploaded successfully https://wiki.opensourceecology.org/images/e/e9/Joehaas.jpg
| |
| ### but the thumbnail generation is throwing an error
| |
| ### for some reason this doesn't happen to the image I uploaded https://wiki.opensourceecology.org/wiki/File:NewFile.jpg
| |
| <pre>
| |
| Error creating thumbnail: Unable to run external programs, proc_open() is disabled. Error code: 1
| |
| </pre>
| |
| ### this appears to be because MediaWiki was configured to use ImageMagick for thumbnail generation. I disabled this in LocalSettings.php, and the page refresh showed the thumbnail instead of the error.
| |
| <pre>
| |
| # we disable using image magick because we intentionally don't grant php exec()
| |
| # or proc_open() permissions
| |
| $wgUseImageMagick = false;
| |
| #$wgImageMagickConvertCommand = "/usr/bin/convert";
| |
| </pre>
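| |
| ### with $wgUseImageMagick off, MediaWiki falls back to PHP's GD extension for scaling, so it's worth confirming GD is actually loaded. A quick sketch:
| |
| <pre>
| |
| # confirm PHP's GD extension is loaded; MediaWiki uses GD for
| |
| # thumbnailing when ImageMagick is disabled
| |
| php -m | grep -i gd
| |
| </pre>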
| |
| ### I confirmed that the thumbnails all exist as files on the system, so at least these thumbnails don't need to be generated at every page load
| |
| <pre>
| |
| [root@hetzner2 wiki.opensourceecology.org]# ls -lah htdocs/images/thumb/e/e9/Joehaas.jpg/
| |
| total 804K
| |
| drwxr-xr-x 2 apache apache 4.0K Apr 15 16:11 .
| |
| drwxrwx--- 36 apache apache 4.0K Apr 15 16:11 ..
| |
| -rw-r--r-- 1 apache apache 242K Apr 15 16:11 1200px-Joehaas.jpg
| |
| -rw-r--r-- 1 apache apache 6.2K Apr 15 16:11 120px-Joehaas.jpg
| |
| -rw-r--r-- 1 apache apache 383K Apr 15 16:11 1600px-Joehaas.jpg
| |
| -rw-r--r-- 1 apache apache 29K Apr 15 16:11 320px-Joehaas.jpg
| |
| -rw-r--r-- 1 apache apache 127K Apr 15 16:11 800px-Joehaas.jpg
| |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| </pre>
| |
| # I sent an email with these findings back to Marcin. I'm still waiting for the test plan.
| |
| | |
| =Thr Apr 12, 2018=
| |
| # ok, returning to the wiki. The last item I changed was to fix the caching to use APCu (CACHE_ACCEL) instead of the db, to prevent the cpPosTime cookie from causing varnish to hit-for-pass, which was rendering our varnish cache useless until the change to APCu. Now that's fixed, I need to test that updating a page's content necessarily includes a call to varnish to purge the cache for the given page.
| |
| # well...the wiki site is inaccessible because I moved it out of the docroot to reduce our backup sizes of hetzner2 on dreamhost. The content was super stale anyway, so I'll just follow my guide to do a fresh fork of the site
| |
| ## I updated the wiki migration guide to use the new tmp dir for the data dumps @ "/usr/home/osemain/noBackup/tmp/" instead of "/usr/home/osemain/tmp/". This prevents the redundant data from being archived in the daily backup.
| |
| # The data dump of the wiki took 1 hour to complete on hetzner1.
| |
| <pre>
| |
| # DECLARE VARIABLES
| |
| source /usr/home/osemain/backups/backup.settings
| |
| stamp=`date +%Y%m%d`
| |
| backupDir_hetzner1="/usr/home/osemain/noBackup/tmp/backups_for_migration_to_hetzner2/wiki_${stamp}"
| |
| backupFileName_db_hetzner1="mysqldump_wiki.${stamp}.sql.bz2"
| |
| backupFileName_files_hetzner1="wiki_files.${stamp}.tar.gz"
| |
| vhostDir_hetzner1='/usr/www/users/osemain/w'
| |
| dbName_hetzner1='osewiki'
| |
| dbUser_hetzner1="${mysqlUser_wiki}"
| |
| dbPass_hetzner1="${mysqlPass_wiki}"
| |
| | |
| # STEP 1: BACKUP DB
| |
| mkdir -p ${backupDir_hetzner1}/{current,old}
| |
| pushd ${backupDir_hetzner1}/current/
| |
| mv ${backupDir_hetzner1}/current/* ${backupDir_hetzner1}/old/
| |
| time nice mysqldump -u"${dbUser_hetzner1}" -p"${dbPass_hetzner1}" --all-databases --single-transaction | bzip2 -c > ${backupDir_hetzner1}/current/${backupFileName_db_hetzner1}
| |
| | |
| # STEP 2: BACKUP FILES
| |
| time nice tar -czvf ${backupDir_hetzner1}/current/${backupFileName_files_hetzner1} ${vhostDir_hetzner1}
| |
| ...
| |
| /usr/www/users/osemain/w/maintenance/testRunner.ora.sql
| |
| | |
| real 60m16.755s
| |
| user 20m29.404s
| |
| sys 1m55.104s
| |
| osemain@dedi978:~/noBackup/tmp/backups_for_migration_to_hetzner2/wiki_20180412/current$
| |
| </pre>
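| |
| # before shipping dumps this size between hosts, it's cheap to sanity-check the compressed archives first. A sketch, reusing the variable names above:
| |
| <pre>
| |
| # verify the bzip2'd mysqldump and the gzip'd tarball are not corrupt
| |
| time nice bzip2 -t ${backupDir_hetzner1}/current/${backupFileName_db_hetzner1}
| |
| time nice gzip -t ${backupDir_hetzner1}/current/${backupFileName_files_hetzner1}
| |
| </pre>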
| |
| # declared variables for this dump; note the timestamp is hardcoded here for future reference & reuse. I also double-checked that MediaWiki 1.30.0 is still the latest stable version.
| |
| <pre>
| |
| # DECLARE VARIABLES
| |
| source /root/backups/backup.settings
| |
| #stamp=`date +%Y%m%d`
| |
| stamp="20180412"
| |
| backupDir_hetzner1="/usr/home/osemain/noBackup/tmp/backups_for_migration_to_hetzner2/wiki_${stamp}"
| |
| backupDir_hetzner2="/var/tmp/backups_for_migration_from_hetzner1/wiki_${stamp}"
| |
| backupFileName_db_hetzner1="mysqldump_wiki.${stamp}.sql.bz2"
| |
| backupFileName_files_hetzner1="wiki_files.${stamp}.tar.gz"
| |
| dbName_hetzner1='osewiki'
| |
| dbName_hetzner2='osewiki_db'
| |
| dbUser_hetzner2="osewiki_user"
| |
| dbPass_hetzner2="CHANGEME"
| |
| vhostDir_hetzner2="/var/www/html/wiki.opensourceecology.org"
| |
| docrootDir_hetzner2="${vhostDir_hetzner2}/htdocs"
| |
| newMediawikiSourceUrl='https://releases.wikimedia.org/mediawiki/1.30/mediawiki-1.30.0.tar.gz'
| |
| </pre>
| |
| # fixed an issue with the rsync command in [Mediawiki#migrate_site_from_hetzner1_to_hetzner2]
| |
| # discovered that the htdocs/.htaccess file doesn't actually exist; hmm. As the find against the pristine mediawiki-1.30.0 tree below shows, MediaWiki doesn't ship a top-level .htaccess anyway, so this appears to be expected
| |
| <pre>
| |
| [root@hetzner2 current]# find /var/www/html/wiki.opensourceecology.org/htdocs/ | grep -i htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/images/deleted/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/images/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/archives/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/includes/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/includes/composer/ComposerVendorHtaccessCreator.php
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/languages/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/serialized/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/extensions/Widgets/compiled_templates/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/tests/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/tests/qunit/.htaccess
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/cache/.htaccess
| |
| [root@hetzner2 current]#
| |
| [root@hetzner2 current]# find mediawiki-1.30.0 | grep -i htaccess
| |
| mediawiki-1.30.0/images/.htaccess
| |
| mediawiki-1.30.0/maintenance/archives/.htaccess
| |
| mediawiki-1.30.0/maintenance/.htaccess
| |
| mediawiki-1.30.0/includes/.htaccess
| |
| mediawiki-1.30.0/includes/composer/ComposerVendorHtaccessCreator.php
| |
| mediawiki-1.30.0/languages/.htaccess
| |
| mediawiki-1.30.0/serialized/.htaccess
| |
| mediawiki-1.30.0/tests/.htaccess
| |
| mediawiki-1.30.0/tests/qunit/.htaccess
| |
| mediawiki-1.30.0/cache/.htaccess
| |
| [root@hetzner2 current]#
| |
| </pre>
| |
| # attempting to run the maintenance/update.php script failed
| |
| <pre>
| |
| [root@hetzner2 current]# pushd ${docrootDir_hetzner2}/maintenance
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/maintenance /var/tmp/backups_for_migration_from_hetzner1/wiki_20180412/current /var/www/html
| |
| [root@hetzner2 maintenance]# php update.php
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 674
| |
| PHP Notice: Undefined index: HTTP_USER_AGENT in /var/www/html/wiki.opensourceecology.org/LocalSettings.php on line 5
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| MediaWiki 1.30.0 Updater
| |
| | |
| Your composer.lock file is up to date with current dependencies!
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php on line 693
| |
| Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information.
| |
| [root@hetzner2 maintenance]#
| |
| </pre>
| |
| # but when I attempt to load the page, I get the following response
| |
| <pre>
| |
| <!DOCTYPE html>
| |
| <html><head><title>Internal error - Open Source Ecology</title><style>body { font-family: sans-serif; margin: 0; padding: 0.5em 2em; }</style></head><body>
| |
| <div class="errorbox">[Ws@Ikz3LkjSWMQ6sfECfMQAAAAo] 2018-04-12 16:25:55: Fatal exception of type MWException</div>
| |
| <!-- Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to show detailed debugging information. --></body></html>
| |
| </pre>
| |
| # oh, duh, I left the dbPass_hetzner2 at "CHANGEME". I re-did the DB commands with this fixed, and then tried again
| |
| # but I first wanted to make sure that deleting the db also deleted the associated users, so before I deleted the db I did a dump of the users & the DBs they have access to
| |
| <pre>
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "select Host,Db,User from db;"
| |
| % test
| |
| % test\\_%
| |
| 127.0.0.1 cacti_db cacti_user
| |
| localhost cacti_db cacti_user
| |
| localhost fef_db fef_user
| |
| localhost obi2_db obi2_user
| |
| localhost obi3_db obi3_user
| |
| localhost obi_db obi_user
| |
| localhost obi_staging_db obi_staging_user
| |
| localhost oseforum_db oseforum_user
| |
| localhost osemain_db osemain_user
| |
| localhost osemain_s_db osemain_s_user
| |
| localhost osewiki_db osewiki_user
| |
| localhost oswh_db oswh_user
| |
| localhost piwik_obi_db piwik_obi_user
| |
| localhost seedhome_db seedhome_user
| |
| [root@hetzner2 sites-enabled]#
| |
| </pre>
| |
| # then I dropped the db
| |
| <pre>
| |
| [root@hetzner2 current]# time nice mysql -uroot -p${mysqlPass} -sNe "DROP DATABASE IF EXISTS ${dbName_hetzner2};"
| |
| | |
| real 0m0.165s
| |
| user 0m0.004s
| |
| sys 0m0.000s
| |
| [root@hetzner2 current]#
| |
| </pre>
| |
| # then I checked again
| |
| <pre>
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "select Host,Db,User from db;"
| |
| % test
| |
| % test\\_%
| |
| 127.0.0.1 cacti_db cacti_user
| |
| localhost cacti_db cacti_user
| |
| localhost fef_db fef_user
| |
| localhost obi2_db obi2_user
| |
| localhost obi3_db obi3_user
| |
| localhost obi_db obi_user
| |
| localhost obi_staging_db obi_staging_user
| |
| localhost oseforum_db oseforum_user
| |
| localhost osemain_db osemain_user
| |
| localhost osemain_s_db osemain_s_user
| |
| localhost osewiki_db osewiki_user
| |
| localhost oswh_db oswh_user
| |
| localhost piwik_obi_db piwik_obi_user
| |
| localhost seedhome_db seedhome_user
| |
| [root@hetzner2 sites-enabled]#
| |
| </pre>
| |
| # well that sucks; the user is still there! This matches their documentation https://dev.mysql.com/doc/refman/5.7/en/drop-database.html
| |
| <blockquote>
| |
| Important: When a database is dropped, privileges granted specifically for the database are not automatically dropped. They must be dropped manually. See Section 13.7.1.4, “GRANT Syntax”.
| |
| </blockquote>
| |
| # so here's the user:
| |
| <pre>
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "select Host,Db,User from db where db = 'osewiki_db';"
| |
| localhost osewiki_db osewiki_user
| |
| [root@hetzner2 sites-enabled]#
| |
| </pre>
| |
| # I had issues dropping the user, but the REVOKE worked. (Note that the DROP USER below targets 'osewiki_db'@'localhost', the db's name, rather than 'osewiki_user'@'localhost', which is why it failed with ERROR 1396.)
| |
| <pre>
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "REVOKE ALL PRIVILEGES ON osewiki_db.* FROM 'osewiki_user'@'localhost'; DROP USER 'osewiki_db'@'localhost'; FLUSH PRIVILEGES;"
| |
| ERROR 1396 (HY000) at line 1: Operation DROP USER failed for 'osewiki_db'@'localhost'
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "select Host,Db,User from db where db = 'osewiki_db';"
| |
| [root@hetzner2 sites-enabled]#
| |
| [root@hetzner2 sites-enabled]# mysql -uroot -p${mysqlPass} mysql -sNe "select Host,Db,User from db;"
| |
| % test
| |
| % test\\_%
| |
| 127.0.0.1 cacti_db cacti_user
| |
| localhost cacti_db cacti_user
| |
| localhost fef_db fef_user
| |
| localhost obi2_db obi2_user
| |
| localhost obi3_db obi3_user
| |
| localhost obi_db obi_user
| |
| localhost obi_staging_db obi_staging_user
| |
| localhost oseforum_db oseforum_user
| |
| localhost osemain_db osemain_user
| |
| localhost osemain_s_db osemain_s_user
| |
| localhost oswh_db oswh_user
| |
| localhost piwik_obi_db piwik_obi_user
| |
| localhost seedhome_db seedhome_user
| |
| [root@hetzner2 sites-enabled]#
| |
| </pre>
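| |
| # for completeness: the account row itself most likely still exists in mysql.user (the REVOKE only removed the db-level grant). A sketch of the full cleanup, assuming the leftover account is 'osewiki_user'@'localhost' (the name the failed DROP USER above should have targeted):
| |
| <pre>
| |
| # check whether the account row survived, then drop it by its real name
| |
| mysql -uroot -p${mysqlPass} -sNe "SELECT User,Host FROM mysql.user WHERE User='osewiki_user';"
| |
| mysql -uroot -p${mysqlPass} -sNe "DROP USER 'osewiki_user'@'localhost'; FLUSH PRIVILEGES;"
| |
| </pre>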
| |
| # ok, I created the db & user again.
| |
| <pre>
| |
| [root@hetzner2 current]# time nice mysql -uroot -p${mysqlPass} -sNe "CREATE DATABASE ${dbName_hetzner2}; USE ${dbName_hetzner2};"
| |
| | |
| real 0m0.004s
| |
| user 0m0.000s
| |
| sys 0m0.003s
| |
| [root@hetzner2 current]# time nice mysql -uroot -p${mysqlPass} < "db.sql"
| |
| | |
| real 2m18.618s
| |
| user 0m9.201s
| |
| sys 0m0.429s
| |
| [root@hetzner2 current]# time nice mysql -uroot -p${mysqlPass} -sNe "GRANT SELECT, INSERT, UPDATE, DELETE ON ${dbName_hetzner2}.* TO '${dbUser_hetzner2}'@'localhost' IDENTIFIED BY '${dbPass_hetzner2}'; FLUSH PRIVILEGES;"
| |
| | |
| real 0m0.004s
| |
| user 0m0.002s
| |
| sys 0m0.001s
| |
| [root@hetzner2 current]#
| |
| </pre>
| |
| # I ran the maintenance/update.php script again; this time it did something
| |
| <pre>
| |
| [root@hetzner2 current]# pushd ${docrootDir_hetzner2}/maintenance
| |
| /var/www/html/wiki.opensourceecology.org/htdocs/maintenance /var/tmp/backups_for_migration_from_hetzner1/wiki_20180412/current /var/www/html
| |
| [root@hetzner2 maintenance]# php update.php
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 674
| |
| PHP Notice: Undefined index: HTTP_USER_AGENT in /var/www/html/wiki.opensourceecology.org/LocalSettings.php on line 5
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php on line 715
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| PHP Notice: Undefined index: SERVER_NAME in /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php on line 1507
| |
| MediaWiki 1.30.0 Updater
| |
| | |
| Your composer.lock file is up to date with current dependencies!
| |
| PHP Warning: ini_set() has been disabled for security reasons in /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php on line 693
| |
| Going to run database updates for osewiki_db-wiki_
| |
| Depending on the size of your database this may take a while!
| |
| Abort with control-c in the next five seconds (skip this countdown with --quick) ... 0
| |
| Turning off Content Handler DB fields for this part of upgrade.
| |
| ...have ipb_id field in ipblocks table.
| |
| ...have ipb_expiry field in ipblocks table.
| |
| ...already have interwiki table
| |
| ...indexes seem up to 20031107 standards.
| |
| ...have rc_type field in recentchanges table.
| |
| ...index new_name_timestamp already set on recentchanges table.
| |
| ...have user_real_name field in user table.
| |
| ...querycache table already exists.
| |
| ...objectcache table already exists.
| |
| ...categorylinks table already exists.
| |
| ...have pagelinks; skipping old links table updates
| |
| ...il_from OK
| |
| ...have rc_ip field in recentchanges table.
| |
| ...index PRIMARY already set on image table.
| |
| ...have rc_id field in recentchanges table.
| |
| ...have rc_patrolled field in recentchanges table.
| |
| ...logging table already exists.
| |
| ...have user_token field in user table.
| |
| ...have wl_notificationtimestamp field in watchlist table.
| |
| ...watchlist talk page rows already present.
| |
| ...user table does not contain user_emailauthenticationtimestamp field.
| |
| ...page table already exists.
| |
| ...have log_params field in logging table.
| |
| ...logging table has correct log_title encoding.
| |
| ...have ar_rev_id field in archive table.
| |
| ...have page_len field in page table.
| |
| ...revision table does not contain inverse_timestamp field.
| |
| ...have rev_text_id field in revision table.
| |
| ...have rev_deleted field in revision table.
| |
| ...have img_width field in image table.
| |
| ...have img_metadata field in image table.
| |
| ...have user_email_token field in user table.
| |
| ...have ar_text_id field in archive table.
| |
| ...page_namespace is already a full int (int(11)).
| |
| ...ar_namespace is already a full int (int(11)).
| |
| ...rc_namespace is already a full int (int(11)).
| |
| ...wl_namespace is already a full int (int(11)).
| |
| ...qc_namespace is already a full int (int(11)).
| |
| ...log_namespace is already a full int (int(11)).
| |
| ...have img_media_type field in image table.
| |
| ...already have pagelinks table.
| |
| ...image table does not contain img_type field.
| |
| ...already have unique user_name index.
| |
| ...user_groups table exists and is in current format.
| |
| ...have ss_total_pages field in site_stats table.
| |
| ...user_newtalk table already exists.
| |
| ...transcache table already exists.
| |
| ...have iw_trans field in interwiki table.
| |
| ...wl_notificationtimestamp is already nullable.
| |
| ...index times already set on logging table.
| |
| ...have ipb_range_start field in ipblocks table.
| |
| ...no page_random rows needed to be set
| |
| ...have user_registration field in user table.
| |
| ...templatelinks table already exists
| |
| ...externallinks table already exists.
| |
| ...job table already exists.
| |
| ...have ss_images field in site_stats table.
| |
| ...langlinks table already exists.
| |
| ...querycache_info table already exists.
| |
| ...filearchive table already exists.
| |
| ...have ipb_anon_only field in ipblocks table.
| |
| ...index rc_ns_usertext already set on recentchanges table.
| |
| ...index rc_user_text already set on recentchanges table.
| |
| ...have user_newpass_time field in user table.
| |
| ...redirect table already exists.
| |
| ...querycachetwo table already exists.
| |
| ...have ipb_enable_autoblock field in ipblocks table.
| |
| ...index pl_namespace on table pagelinks includes field pl_from.
| |
| ...index tl_namespace on table templatelinks includes field tl_from.
| |
| ...index il_to on table imagelinks includes field il_from.
| |
| ...have rc_old_len field in recentchanges table.
| |
| ...have user_editcount field in user table.
| |
| ...page_restrictions table already exists.
| |
| ...have log_id field in logging table.
| |
| ...have rev_parent_id field in revision table.
| |
| ...have pr_id field in page_restrictions table.
| |
| ...have rev_len field in revision table.
| |
| ...have rc_deleted field in recentchanges table.
| |
| ...have log_deleted field in logging table.
| |
| ...have ar_deleted field in archive table.
| |
| ...have ipb_deleted field in ipblocks table.
| |
| ...have fa_deleted field in filearchive table.
| |
| ...have ar_len field in archive table.
| |
| ...have ipb_block_email field in ipblocks table.
| |
| ...index cl_sortkey on table categorylinks includes field cl_from.
| |
| ...have oi_metadata field in oldimage table.
| |
| ...index usertext_timestamp already set on archive table.
| |
| ...index img_usertext_timestamp already set on image table.
| |
| ...index oi_usertext_timestamp already set on oldimage table.
| |
| ...have ar_page_id field in archive table.
| |
| ...have img_sha1 field in image table.
| |
| ...protected_titles table already exists.
| |
| ...have ipb_by_text field in ipblocks table.
| |
| ...page_props table already exists.
| |
| ...updatelog table already exists.
| |
| ...category table already exists.
| |
| ...category table already populated.
| |
| ...have ar_parent_id field in archive table.
| |
| ...have user_last_timestamp field in user_newtalk table.
| |
| ...protected_titles table has correct pt_title encoding.
| |
| ...have ss_active_users field in site_stats table.
| |
| ...ss_active_users user count set...
| |
| ...have ipb_allow_usertalk field in ipblocks table.
| |
| ...change_tag table already exists.
| |
| ...tag_summary table already exists.
| |
| ...valid_tag table already exists.
| |
| ...user_properties table already exists.
| |
| ...log_search table already exists.
| |
| ...have log_user_text field in logging table.
| |
| ...l10n_cache table already exists.
| |
| ...index change_tag_rc_tag already set on change_tag table.
| |
| ...have rd_interwiki field in redirect table.
| |
| ...transcache tc_time already converted.
| |
| ...*_mime_minor fields are already long enough.
| |
| ...iwlinks table already exists.
| |
| ...index iwl_prefix_title_from already set on iwlinks table.
| |
| ...have ul_value field in updatelog table.
| |
| ...have iw_api field in interwiki table.
| |
| ...iwl_prefix key doesn't exist.
| |
| ...have cl_collation field in categorylinks table.
| |
| ...categorylinks up-to-date.
| |
| ...module_deps table already exists.
| |
| ...ar_page_revid key doesn't exist.
| |
| ...index ar_revid already set on archive table.
| |
| ...ll_lang is up-to-date.
| |
| ...user_last_timestamp is already nullable.
| |
| ...index user_email already set on user table.
| |
| ...up_property in table user_properties already modified by patch patch-up_property.sql.
| |
| ...uploadstash table already exists.
| |
| ...user_former_groups table already exists.
| |
| ...index type_action already set on logging table.
| |
| ...have rev_sha1 field in revision table.
| |
| ...batch conversion of user_options: nothing to migrate. done.
| |
| ...user table does not contain user_options field.
| |
| ...have ar_sha1 field in archive table.
| |
| ...index page_redirect_namespace_len already set on page table.
| |
| ...have us_chunk_inx field in uploadstash table.
| |
| ...have job_timestamp field in job table.
| |
| ...index page_user_timestamp already set on revision table.
| |
| ...have ipb_parent_block_id field in ipblocks table.
| |
| ...index ipb_parent_block_id already set on ipblocks table.
| |
| ...category table does not contain cat_hidden field.
| |
| ...have rev_content_format field in revision table.
| |
| ...have rev_content_model field in revision table.
| |
| ...have ar_content_format field in archive table.
| |
| ...have ar_content_model field in archive table.
| |
| ...have page_content_model field in page table.
| |
| Content Handler DB fields should be usable now.
| |
| ...site_stats table does not contain ss_admins field.
| |
| ...recentchanges table does not contain rc_moved_to_title field.
| |
| ...sites table already exists.
| |
| ...have fa_sha1 field in filearchive table.
| |
| ...have job_token field in job table.
| |
| ...have job_attempts field in job table.
| |
| ...have us_props field in uploadstash table.
| |
| ...ug_group in table user_groups already modified by patch patch-ug_group-length-increase-255.sql.
| |
| ...ufg_group in table user_former_groups already modified by patch patch-ufg_group-length-increase-255.sql.
| |
| ...index pp_propname_page already set on page_props table.
| |
| ...index img_media_mime already set on image table.
| |
| ...iwl_prefix_title_from index is already non-UNIQUE.
| |
| ...index iwl_prefix_from_title already set on iwlinks table.
| |
| ...have ar_id field in archive table.
| |
| ...have el_id field in externallinks table.
| |
| ...have rc_source field in recentchanges table.
| |
| ...index log_user_text_type_time already set on logging table.
| |
| ...index log_user_text_time already set on logging table.
| |
| ...have page_links_updated field in page table.
| |
| ...have user_password_expires field in user table.
| |
| ...have pp_sortkey field in page_props table.
| |
| ...recentchanges table does not contain rc_cur_time field.
| |
| ...index wl_user_notificationtimestamp already set on watchlist table.
| |
| ...have page_lang field in page table.
| |
| ...have pl_from_namespace field in pagelinks table.
| |
| ...have tl_from_namespace field in templatelinks table.
| |
| ...have il_from_namespace field in imagelinks table.
| |
| ...img_major_mime in table image already modified by patch patch-img_major_mime-chemical.sql.
| |
| ...oi_major_mime in table oldimage already modified by patch patch-oi_major_mime-chemical.sql.
| |
| ...fa_major_mime in table filearchive already modified by patch patch-fa_major_mime-chemical.sql.
| |
| Extending edit summary lengths (and setting defaults) ...Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information.
| |
| [root@hetzner2 maintenance]#
| |
| </pre>
| |
| # but I'm still getting an error when trying to load it
| |
| <pre>
| |
| user@personal:~$ curl -i "https://wiki.opensourceecology.org/"
| |
| HTTP/1.1 500 Internal Server Error
| |
| Server: nginx
| |
| Date: Thu, 12 Apr 2018 17:05:38 GMT
| |
| Content-Type: text/html; charset=utf-8
| |
| Content-Length: 421
| |
| Connection: keep-alive
| |
| X-Content-Type-Options: nosniff
| |
| X-XSS-Protection: 1; mode=block
| |
| X-Varnish: 98392 32786
| |
| Age: 86
| |
| Via: 1.1 varnish-v4
| |
| | |
| <!DOCTYPE html>
| |
| <html><head><title>Internal error - Open Source Ecology</title><style>body { font-family: sans-serif; margin: 0; padding: 0.5em 2em; }</style></head><body>
| |
| <div class="errorbox">[Ws@RjNRic8mf4rYA2bqP2AAAAAg] 2018-04-12 17:04:12: Fatal exception of type MWException</div>
| |
| <!-- Set $wgShowExceptionDetails = true; at the bottom of LocalSettings.php to show detailed debugging information. --></body></html>
| |
| user@personal:~$
| |
| </pre>
| |
| # unfortunately 'wiki-error.log' is not showing any content. So I took the error's advice & added $wgShowExceptionDetails, even though this will leak the error to the user :(
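| |
| # concretely, that means appending these two lines (quoted verbatim by the error page) to the bottom of LocalSettings.php:
| |
| <pre>
| |
| # temporary debugging only: these leak exception details to visitors
| |
| $wgShowExceptionDetails = true;
| |
| $wgShowDBErrorBacktrace = true;
| |
| </pre>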
| |
| # silly, there was still no content sent to 'wiki-error.log', but the curl gave me better info
| |
| <pre>
| |
| user@personal:~$ curl -i "https://wiki.opensourceecology.org/"
| |
| HTTP/1.1 500 Internal Server Error
| |
| Server: nginx
| |
| Date: Thu, 12 Apr 2018 17:10:01 GMT
| |
| Content-Type: text/html; charset=utf-8
| |
| Content-Length: 3184
| |
| Connection: keep-alive
| |
| X-Content-Type-Options: nosniff
| |
| X-XSS-Protection: 1; mode=block
| |
| X-Varnish: 196610 65728
| |
| Age: 55
| |
| Via: 1.1 varnish-v4
| |
| | |
| <!DOCTYPE html>
| |
| <html><head><title>Internal error - Open Source Ecology</title><style>body { font-family: sans-serif; margin: 0; padding: 0.5em 2em; }</style></head><body>
| |
| <p>[Ws@SsmBHAg3J1XRaFStUtgAAAAQ] / MWException from line 108 of /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/localisation/LCStoreCDB.php: Unable to open CDB file "/var/www/html/wiki.opensourceecology.org/htdocs/../cache/l10n_cache-en.cdb.tmp.956119238" for write.</p><p>Backtrace:</p><p>#0 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/localisation/LocalisationCache.php(1013): LCStoreCDB->startWrite(string)<br />
| |
| #1 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/localisation/LocalisationCache.php(459): LocalisationCache->recache(string)<br />
| |
| #2 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/localisation/LocalisationCache.php(376): LocalisationCache->initLanguage(string)<br />
| |
| #3 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/localisation/LocalisationCache.php(291): LocalisationCache->loadSubitem(string, string, string)<br />
| |
| #4 /var/www/html/wiki.opensourceecology.org/htdocs/languages/Language.php(2587): LocalisationCache->getSubitem(string, string, string)<br />
| |
| #5 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/MessageCache.php(933): Language->getMessage(string)<br />
| |
| #6 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/MessageCache.php(888): MessageCache->getMessageForLang(LanguageEn, string, boolean, array)<br />
| |
| #7 /var/www/html/wiki.opensourceecology.org/htdocs/includes/cache/MessageCache.php(829): MessageCache->getMessageFromFallbackChain(LanguageEn, string, boolean)<br />
| |
| #8 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Message.php(1275): MessageCache->get(string, boolean, LanguageEn)<br />
| |
| #9 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Message.php(842): Message->fetchMessage()<br />
| |
| #10 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Message.php(934): Message->toString(string)<br />
| |
| #11 /var/www/html/wiki.opensourceecology.org/htdocs/includes/title/MalformedTitleException.php(49): Message->text()<br />
| |
| #12 /var/www/html/wiki.opensourceecology.org/htdocs/includes/title/MediaWikiTitleCodec.php(311): MalformedTitleException->__construct(string, string)<br />
| |
| #13 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Title.php(3526): MediaWikiTitleCodec->splitTitleString(string, integer)<br />
| |
| #14 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Title.php(361): Title->secureAndSplit()<br />
| |
| #15 /var/www/html/wiki.opensourceecology.org/htdocs/includes/MediaWiki.php(84): Title::newFromURL(NULL)<br />
| |
| #16 /var/www/html/wiki.opensourceecology.org/htdocs/includes/MediaWiki.php(140): MediaWiki->parseTitle()<br />
| |
| #17 /var/www/html/wiki.opensourceecology.org/htdocs/includes/MediaWiki.php(767): MediaWiki->getTitle()<br />
| |
| #18 /var/www/html/wiki.opensourceecology.org/htdocs/includes/MediaWiki.php(523): MediaWiki->main()<br />
| |
| #19 /var/www/html/wiki.opensourceecology.org/htdocs/index.php(43): MediaWiki->run()<br />
| |
| #20 {main}</p>
| |
| </body></html>
| |
| user@personal:~$
| |
| </pre>
| |
| # ok, so this is an issue with the 'cache' dir outside the docroot, which is needed for storing interface messages to files per Aaron Schulz's guide. I actually already created this, but the permissions were wrong! I updated the documentation in the migration guide to include creating this dir & setting its permissions correctly
| |
| <pre>
| |
| [root@hetzner2 wiki.opensourceecology.org]# ls -lah /var/www/html/wiki.opensourceecology.org/cache
| |
| total 1.1M
| |
| d---r-x--- 2 not-apache apache 4.0K Mar 16 23:55 .
| |
| d---r-x--- 4 not-apache apache 4.0K Apr 12 17:08 ..
| |
| ----r----- 1 not-apache apache 1.1M Mar 16 23:55 l10n_cache-en.cdb
| |
| [root@hetzner2 wiki.opensourceecology.org]# [ -d "${vhostDir_hetzner2}/cache" ] || mkdir "${vhostDir_hetzner2}/cache"
| |
| [root@hetzner2 wiki.opensourceecology.org]# chown -R apache:apache "${vhostDir_hetzner2}/cache"
| |
| [root@hetzner2 wiki.opensourceecology.org]# find "${vhostDir_hetzner2}/cache" -type f -exec chmod 0660 {} \;
| |
| [root@hetzner2 wiki.opensourceecology.org]# find "${vhostDir_hetzner2}/cache" -type d -exec chmod 0770 {} \;
| |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| [root@hetzner2 wiki.opensourceecology.org]# ls -lah /var/www/html/wiki.opensourceecology.org/cache
| |
| total 1.1M
| |
| drwxrwx--- 2 apache apache 4.0K Mar 16 23:55 .
| |
| d---r-x--- 4 not-apache apache 4.0K Apr 12 17:08 ..
| |
| -rw-rw---- 1 apache apache 1.1M Mar 16 23:55 l10n_cache-en.cdb
| |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| </pre>
| |
| # I manually purged the varnish cache, reloaded, and it worked!
| |
| <pre>
| |
| [root@hetzner2 wiki.opensourceecology.org]# varnishadm 'ban req.url ~ "."'
| |
| | |
| [root@hetzner2 wiki.opensourceecology.org]#
| |
| user@personal:~$ curl -i "https://wiki.opensourceecology.org/"
| |
| HTTP/1.1 301 Moved Permanently
| |
| Server: nginx
| |
| Date: Thu, 12 Apr 2018 17:28:07 GMT
| |
| Content-Type: text/html; charset=utf-8
| |
| Content-Length: 0
| |
| Connection: keep-alive
| |
| X-Content-Type-Options: nosniff
| |
| Vary: Accept-Encoding,Cookie
| |
| Cache-Control: s-maxage=1200, must-revalidate, max-age=0
| |
| Last-Modified: Thu, 12 Apr 2018 17:28:07 GMT
| |
| Location: https://wiki.opensourceecology.org/wiki/Main_Page
| |
| X-XSS-Protection: 1; mode=block
| |
| X-Varnish: 625
| |
| Age: 0
| |
| Via: 1.1 varnish-v4
| |
| Strict-Transport-Security: max-age=15552001
| |
| Public-Key-Pins: pin-sha256="UbSbHFsFhuCrSv9GNsqnGv4CbaVh5UV5/zzgjLgHh9c="; pin-sha256="YLh1dUR9y6Kja30RrAn7JKnbQG/uEtLMkBgFF2Fuihg="; pin-sha256="C5+lpZ7tcVwmwQIMcRtPbsQtWLABXhQzejna0wHFr8M="; pin-sha256="Vjs8r4z+80wjNcr1YKepWQboSIRi63WsWXhIMN+eWys="; pin-sha256="lCppFqbkrlJ3EcVFAkeip0+44VaoJUymbnOaEUk7tEU="; pin-sha256="K87oWBWM9UZfyddvDfoxL+8lpNyoUB2ptGtn0fv6G2Q="; pin-sha256="Y9mvm0exBk1JoQ57f9Vm28jKo5lFm/woKcVxrYxu80o="; pin-sha256="EGn6R6CqT4z3ERscrqNl7q7RCzJmDe9uBhS/rnCHU="; pin-sha256="NIdnza073SiyuN1TUa7DDGjOxc1p0nbfOCfbxPWAZGQ="; pin-sha256="fNZ8JI9p2D/C+bsB3LH3rWejY9BGBDeW0JhMOiMfa7A="; pin-sha256="oyD01TTXvpfBro3QSZc1vIlcMjrdLTiL/M9mLCPX+Zo="; pin-sha256="0cRTd+vc1hjNFlHcLgLCHXUeWqn80bNDH/bs9qMTSPo="; pin-sha256="MDhNnV1cmaPdDDONbiVionUHH2QIf2aHJwq/lshMWfA="; pin-sha256="OIZP7FgTBf7hUpWHIA7OaPVO2WrsGzTl9vdOHLPZmJU="; max-age=3600; includeSubDomains; report-uri="http:opensourceecology.org/hpkp-report"
| |
| | |
| user@personal:~$ curl -i "https://wiki.opensourceecology.org/wiki/Main_Page"
| |
| ...
| |
| </body>
| |
| </html>
| |
| user@personal:~$
| |
| </pre>
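| |
| # for future reference, a quick way to confirm a ban actually registered is to list the active bans. A sketch, assuming varnishadm can reach the default admin socket:
| |
| <pre>
| |
| # the 'req.url ~ "."' ban should show up at the top of the list
| |
| varnishadm ban.list
| |
| </pre>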
| |
| # ...but when I went to login, I got an error:
| |
| <pre>
| |
| [Ws@XjG9Z0eot@07Oyosq6gAAAAc] 2018-04-12 17:29:48: Fatal exception of type "Wikimedia\Rdbms\DBQueryError"
| |
| </pre>
| |
| # I apparently encountered this in the past, but the issue was that I needed to fix an ini_set that I mangled with a sed & re-run the maintenance scripts [Maltfield_log_2018#Tue_Feb_27.2C_2018]
| |
| # I gave the maintenance scripts another couple of runs, cleared the varnish cache, and tried to login again; I got the same error
| |
| <pre>
| |
| [Ws@Zy9Ric8mf4rYA2bqQDgAAAAg] 2018-04-12 17:39:23: Fatal exception of type "Wikimedia\Rdbms\DBQueryError"
| |
| </pre>
| |
| # I noticed that the 'wiki-error.log' file was populating with output from the update.php run, and it looks like the issue was that the db user doesn't have the ALTER permission on the db
| |
| <pre>
| |
| [error] [75060eb56a79742c1c46e7a5] [no req] ErrorException from line 1507 of /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php: PHP Notice: Undefined index: SERVER_NAME
| |
| #0 /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php(1507): MWExceptionHandler::handleError(integer, string, string, integer, array)
| |
| #1 /var/www/html/wiki.opensourceecology.org/htdocs/includes/db/MWLBFactory.php(60): wfHostname()
| |
| #2 /var/www/html/wiki.opensourceecology.org/htdocs/includes/ServiceWiring.php(54): MWLBFactory::applyDefaultConfig(array, GlobalVarConfig, ConfiguredReadOnlyMode)
| |
| #3 [internal function]: MediaWiki\Services\ServiceContainer->{closure}(MediaWiki\MediaWikiServices)
| |
| #4 /var/www/html/wiki.opensourceecology.org/htdocs/includes/services/ServiceContainer.php(361): call_user_func_array(Closure, array)
| |
| #5 /var/www/html/wiki.opensourceecology.org/htdocs/includes/services/ServiceContainer.php(344): MediaWiki\Services\ServiceContainer->createService(string)
| |
| #6 /var/www/html/wiki.opensourceecology.org/htdocs/includes/MediaWikiServices.php(503): MediaWiki\Services\ServiceContainer->getService(string)
| |
| #7 /var/www/html/wiki.opensourceecology.org/htdocs/includes/Setup.php(664): MediaWiki\MediaWikiServices->getDBLoadBalancerFactory()
| |
| #8 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/doMaintenance.php(79): require_once(string)
| |
| #9 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(249): require_once(string)
| |
| #10 {main}
| |
| IP: 127.0.0.1
| |
| Start command line script update.php
| |
| [caches] cluster: APCBagOStuff, WAN: mediawiki-main-default, stash: db-replicated, message: APCBagOStuff, session: APCBagOStuff
| |
| [caches] LocalisationCache: using store LCStoreNull
| |
| [error] [75060eb56a79742c1c46e7a5] [no req] ErrorException from line 1507 of /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php: PHP Notice: Undefined index: SERVER_NAME
| |
| #0 /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php(1507): MWExceptionHandler::handleError(integer, string, string, integer, array)
| |
| #1 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php(565): wfHostname()
| |
| #2 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/doMaintenance.php(89): Maintenance->setAgentAndTriggers()
| |
| #3 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(249): require_once(string)
| |
| #4 {main}
| |
| [DBReplication] Wikimedia\Rdbms\LBFactory::getChronologyProtector: using request info {
| |
| "IPAddress": "127.0.0.1",
| |
| "UserAgent": false,
| |
| "ChronologyProtection": false
| |
| }
| |
| [DBConnection] Wikimedia\Rdbms\LoadBalancer::openConnection: calling initLB() before first connection.
| |
| [error] [75060eb56a79742c1c46e7a5] [no req] ErrorException from line 693 of /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php: PHP Warning: ini_set() has been disabled for security reasons
| |
| #0 [internal function]: MWExceptionHandler::handleError(integer, string, string, integer, array)
| |
| #1 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(693): ini_set(string, string)
| |
| #2 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/DatabaseMysqlBase.php(129): Wikimedia\Rdbms\Database->installErrorHandler()
| |
| #3 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(285): Wikimedia\Rdbms\DatabaseMysqlBase->open(string, string, string, string)
| |
| #4 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/DatabaseMysqlBase.php(102): Wikimedia\Rdbms\Database->__construct(array)
| |
| #5 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(415): Wikimedia\Rdbms\DatabaseMysqlBase->__construct(array)
| |
| #6 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/loadbalancer/LoadBalancer.php(985): Wikimedia\Rdbms\Database::factory(string, array)
| |
| #7 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/loadbalancer/LoadBalancer.php(801): Wikimedia\Rdbms\LoadBalancer->reallyOpenConnection(array, boolean)
| |
| #8 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/loadbalancer/LoadBalancer.php(667): Wikimedia\Rdbms\LoadBalancer->openConnection(integer, boolean, integer)
| |
| #9 /var/www/html/wiki.opensourceecology.org/htdocs/includes/GlobalFunctions.php(2858): Wikimedia\Rdbms\LoadBalancer->getConnection(integer, array, boolean)
| |
| #10 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/Maintenance.php(1253): wfGetDB(integer, array, boolean)
| |
| #11 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(146): Maintenance->getDB(integer)
| |
| #12 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/doMaintenance.php(92): UpdateMediaWiki->execute()
| |
| #13 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(249): require_once(string)
| |
| #14 {main}
| |
| [DBConnection] Connected to database 0 at 'localhost'.
| |
| [DBQuery] SQL ERROR: ALTER command denied to user 'osewiki_user'@'localhost' for table 'wiki_revision' (localhost)
| |
| | |
| [exception] [75060eb56a79742c1c46e7a5] [no req] Wikimedia\Rdbms\DBQueryError from line 1149 of /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php: A database query error has occurred. Did you forget to run your application's database schema updater after upgrading?
| |
| Query: ALTER TABLE `wiki_revision` MODIFY rev_comment varbinary(767) NOT NULL default ''
| |
| | |
| Function: Wikimedia\Rdbms\Database::sourceFile( /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/archives/patch-editsummary-length.sql )
| |
| Error: 1142 ALTER command denied to user 'osewiki_user'@'localhost' for table 'wiki_revision' (localhost)
| |
| | |
| #0 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(979): Wikimedia\Rdbms\Database->reportQueryError(string, integer, string, string, boolean)
| |
| #1 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(3325): Wikimedia\Rdbms\Database->query(string, string)
| |
| #2 /var/www/html/wiki.opensourceecology.org/htdocs/includes/libs/rdbms/database/Database.php(3274): Wikimedia\Rdbms\Database->sourceStream(unknown type, NULL, NULL, string, NULL)
| |
| #3 /var/www/html/wiki.opensourceecology.org/htdocs/includes/installer/DatabaseUpdater.php(673): Wikimedia\Rdbms\Database->sourceFile(string)
| |
| #4 /var/www/html/wiki.opensourceecology.org/htdocs/includes/installer/MysqlUpdater.php(1194): DatabaseUpdater->applyPatch(string, boolean, string)
| |
| #5 [internal function]: MysqlUpdater->doExtendCommentLengths()
| |
| #6 /var/www/html/wiki.opensourceecology.org/htdocs/includes/installer/DatabaseUpdater.php(472): call_user_func_array(array, array)
| |
| #7 /var/www/html/wiki.opensourceecology.org/htdocs/includes/installer/DatabaseUpdater.php(436): DatabaseUpdater->runUpdates(array, boolean)
| |
| #8 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(204): DatabaseUpdater->doUpdates(array)
| |
| #9 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/doMaintenance.php(92): UpdateMediaWiki->execute()
| |
| #10 /var/www/html/wiki.opensourceecology.org/htdocs/maintenance/update.php(249): require_once(string)
| |
| #11 {main}
| |
| [DBConnection] Closing connection to database 'localhost'.
| |
| </pre>
| |
| # this issue was somewhat anticipated as it was described in the Mediawiki Security docs https://www.mediawiki.org/wiki/Manual:Security#General_MySQL_and_MariaDB_recommendations
| |
| # I created a new db "superuser" & granted it all permissions. note that I had to use "osewiki_superusr" instead of "osewiki_superuser" due to MySQL's 16-character length limit on usernames *sigh*
| |
| <pre>
| |
| [root@hetzner2 maintenance]# dbSuperUser_hetzner2="osewiki_superuser"
| |
| [root@hetzner2 maintenance]# dbSuperPass_hetzner2="CHANGEME"
| |
| [root@hetzner2 maintenance]# time nice mysql -uroot -p${mysqlPass} -sNe "GRANT ALL ON ${dbName_hetzner2}.* TO '${dbSuperUser_hetzner2}'@'localhost' IDENTIFIED BY '${dbSuperPass_hetzner2}'; FLUSH PRIVILEGES;"
| |
| ERROR 1470 (HY000) at line 1: String 'osewiki_superuser' is too long for user name (should be no longer than 16)
| |
| | |
| real 0m0.004s
| |
| user 0m0.002s
| |
| sys 0m0.002s
| |
| [root@hetzner2 maintenance]#
| |
| [root@hetzner2 maintenance]# dbSuperUser_hetzner2="osewiki_superusr"
| |
| [root@hetzner2 maintenance]# time nice mysql -uroot -p${mysqlPass} -sNe "GRANT ALL ON ${dbName_hetzner2}.* TO '${dbSuperUser_hetzner2}'@'localhost' IDENTIFIED BY '${dbSuperPass_hetzner2}'; FLUSH PRIVILEGES;"
| |
| | |
| real 0m0.004s
| |
| user 0m0.000s
| |
| sys 0m0.003s
| |
| [root@hetzner2 maintenance]#
| |
| </pre>
| |
| # I ran the maintenance script again, giving it a distinct db user's credentials via arguments. It worked.
| |
| <pre>
| |
| [root@hetzner2 maintenance]# php update.php --dbuser "${dbSuperUser_hetzner2}" --dbpass "${dbSuperPass_hetzner2}"
| |
| ...
| |
| Attempted to insert 685 IP revisions, 685 actually done.
| |
| Set the local repo temp zone container to be private.
| |
| Purging caches...done.
| |
| | |
| Done in 26 s.
| |
| [root@hetzner2 maintenance]#
| |
| </pre>
| |
| # I refreshed the login attempt, and I logged-in successfully. I also updated the documentation to include these arguments in the call to execute update.php.
| |
| # made an edit to a page, and it appeared to work fine. Then I went to a distinct ephemeral browser, loaded the page, and the edit didn't show.
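| |
| # rather than juggling ephemeral browsers, the same check can be done from the shell. A sketch using curl (substituting the edited page's URL); a non-zero Age header means varnish served a cached copy, i.e. the edit's purge never reached it:
| |
| <pre>
| |
| # Age > 0 means the response came from varnish's cache
| |
| curl -sI "https://wiki.opensourceecology.org/wiki/Main_Page" | grep -Ei '^(age|x-varnish):'
| |
| </pre>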
| |
| # I put together a command to watch for varnish purge requests; I tested it by triggering a purge from the wordpress page on osemain
| |
| <pre>
| |
| [root@hetzner2 ~]# varnishlog | grep -EC20 "ReqMethod\s*PURGE"
| |
| - Timestamp Process: 1523562967.680578 0.341190 0.000040
| |
| - Debug "RES_MODE 4"
| |
| - RespHeader Connection: close
| |
| - Timestamp Resp: 1523562967.691852 0.352464 0.011274
| |
| - Debug "XXX REF 2"
| |
| - ReqAcct 339 0 339 393 91587 91980
| |
| - End
| |
| | |
| * << Session >> 99573
| |
| - Begin sess 0 HTTP/1
| |
| - SessOpen 127.0.0.1 57242 127.0.0.1:6081 127.0.0.1 6081 1523562967.339361 12
| |
| - Link req 99574 rxreq
| |
| - SessClose RESP_CLOSE 0.353
| |
| - End
| |
| | |
| * << Request >> 329175
| |
| - Begin req 329174 rxreq
| |
| - Timestamp Start: 1523562967.712793 0.000000 0.000000
| |
| - Timestamp Req: 1523562967.712793 0.000000 0.000000
| |
| - ReqStart 127.0.0.1 57244
| |
| - ReqMethod PURGE
| |
| - ReqURL /.*
| |
| - ReqProtocol HTTP/1.1
| |
| - ReqHeader User-Agent: WordPress/4.9.4; https://www.opensourceecology.org
| |
| - ReqHeader Accept-Encoding: deflate;q=1.0, compress;q=0.5, gzip;q=0.5
| |
| - ReqHeader host: www.opensourceecology.org
| |
| - ReqHeader X-VC-Purge-Method: regex
| |
| - ReqHeader X-VC-Purge-Host: www.opensourceecology.org
| |
| - ReqHeader Connection: Close
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_call RECV
| |
| - ReqUnset X-Forwarded-For: 127.0.0.1
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1, 127.0.0.1
| |
| - ReqHeader X-VC-My-Purge-Key: JOaSAn72IJzrykJp1pfEWaECvUU8KvxZJnxSue3repId3qV8wJOHexjtuhi9r6Wv4FH9y9eFfiMjXX6hvxRrVOEWr2IaBVZMZ7ToEz8nLFdRyjyMkUGMANd6MHOzxiTJ
| |
| - ReqHeader X-VC-Purge-Key-Auth: false
| |
| - VCL_acl MATCH purge "localhost"
| |
| - Debug "VCL_error(200, Purged /.* www.opensourceecology.org)"
| |
| - VCL_return synth
| |
| - ReqUnset Accept-Encoding: deflate;q=1.0, compress;q=0.5, gzip;q=0.5
| |
| - ReqHeader Accept-Encoding: gzip
| |
| - VCL_call HASH
| |
| </pre>
| |
| # I just noticed the "X-VC-My-Purge-Key" in my logs, which matches /etc/varnish/secret. Well, I already logged it on the public internet, so it's not a secret anymore. Our purges *should* work by an ACL limited to IP address, so I went ahead and changed the contents of /etc/varnish/secret to a new, random 128-character string & restarted varnish
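| |
| # for the record, a one-liner along these lines generates such a string. A sketch, assuming /dev/urandom and systemd:
| |
| <pre>
| |
| # write a fresh random 128-character alphanumeric secret, then restart varnish
| |
| tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 128 > /etc/varnish/secret
| |
| systemctl restart varnish
| |
| </pre>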
| |
| # I went to update this purge key in wordpress, but I didn't see it set anywhere
| |
| # I made a minor change to the workshops page & saved it. My browser showed the change, but a distinct/ephemeral browser refresh did not show it. I triggered a purge of the page from the wp wui, and a log entry popped up in my grep. I refreshed in the distinct/ephemeral browser, and the change was now visible. That confirms that purging still works despite the change to the purge key. Note that the output below still shows the old purge key. *shrug*
| |
| <pre>
| |
| * << Request >> 458860
| |
| - Begin req 458859 rxreq
| |
| - Timestamp Start: 1523563588.895393 0.000000 0.000000
| |
| - Timestamp Req: 1523563588.895393 0.000000 0.000000
| |
| - ReqStart 127.0.0.1 59502
| |
| - ReqMethod PURGE
| |
| - ReqURL /
| |
| - ReqProtocol HTTP/1.1
| |
| - ReqHeader User-Agent: WordPress/4.9.4; https://www.opensourceecology.org
| |
| - ReqHeader Accept-Encoding: deflate;q=1.0, compress;q=0.5, gzip;q=0.5
| |
| - ReqHeader host: www.opensourceecology.org
| |
| - ReqHeader X-VC-Purge-Method: default
| |
| - ReqHeader X-VC-Purge-Host: www.opensourceecology.org
| |
| - ReqHeader Connection: Close
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_call RECV
| |
| - ReqUnset X-Forwarded-For: 127.0.0.1
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1, 127.0.0.1
| |
| - ReqHeader X-VC-My-Purge-Key: JOaSAn72IJzrykJp1pfEWaECvUU8KvxZJnxSue3repId3qV8wJOHexjtuhi9r6Wv4FH9y9eFfiMjXX6hvxRrVOEWr2IaBVZMZ7ToEz8nLFdRyjyMkUGMANd6MHOzxiTJ
| |
| - ReqHeader X-VC-Purge-Key-Auth: false
| |
| - VCL_acl MATCH purge "localhost"
| |
| - ReqURL /
| |
| - Debug "VCL_error(200, Purged / www.opensourceecology.org)"
| |
| - VCL_return synth
| |
| - ReqUnset Accept-Encoding: deflate;q=1.0, compress;q=0.5, gzip;q=0.5
| |
| - ReqHeader Accept-Encoding: gzip
| |
| </pre>
| |
| # I checked my varnish config for the wiki site, starting at the vcl_recv() function. The purge ACL itself is defined in conf/acl.vcl, which is included by the main vcl file = default.vcl
| |
| <pre>
| |
| [root@hetzner2 varnish]# cat default.vcl
| |
| ################################################################################
| |
| # File: default.vcl
| |
| # Version: 0.1
| |
| # Purpose: Main config file for varnish cache. Note that it's intentionally
| |
| # mostly bare to allow robust vhost-specific logic. Please see this
| |
| # for more info:
| |
| # * https://www.getpagespeed.com/server-setup/varnish/varnish-virtual-hosts
| |
| # Author: Michael Altfield <michael@opensourceecology.org>
| |
| # Created: 2017-11-12
| |
| # Updated: 2017-11-12
| |
| ################################################################################
| |
| | |
| vcl 4.0;
| |
| | |
| ############
| |
| # INCLUDES #
| |
| ############
| |
| #
| |
| import std;
| |
| | |
| include "conf/acl.vcl";
| |
| include "lib/purge.vcl";
| |
| | |
| | |
| include "all-vhosts.vcl";
| |
| include "catch-all.vcl";
| |
| [root@hetzner2 varnish]# cat conf/acl.vcl
| |
| acl purge {
| |
| "localhost";
| |
| "127.0.0.1";
| |
| }
| |
| [root@hetzner2 varnish]#
| |
| </pre>
| |
| # I tested an edit again, but no output came from my grep of the varnishlog for purge requests. hmm.
| |
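| ## (the grep I keep mentioning is just a watch on varnishlog for PURGEs; on varnish 4.x this can also be done natively with a VSL query. a sketch, not necessarily my exact invocation:)
| |
| <pre>
| |
| # show only transactions whose request method is PURGE
| |
| varnishlog -g request -q 'ReqMethod eq "PURGE"'
| |
| </pre>
| |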
| # I dug down to a tcpdump; here's a positive coming from the purge in the wordpress wui on osemain
| |
| <pre>
| |
| [root@hetzner2 varnish]# tcpdump -i lo -nX dst port 6081
| |
| ...
| |
| 22:55:00.501715 IP 127.0.0.1.53826 > 127.0.0.1.6081: Flags [P.], seq 0:1252, ack 1, win 342, options [nop,nop,TS val 3519411272 ecr 3519411272], length 1252
| |
| 0x0000: 4500 0518 ddd4 4000 4006 5a09 7f00 0001 E.....@.@.Z.....
| |
| 0x0010: 7f00 0001 d242 17c1 9300 299b 26a2 4143 .....B....).&.AC
| |
| 0x0020: 8018 0156 030d 0000 0101 080a d1c5 f448 ...V...........H
| |
| 0x0030: d1c5 f448 4745 5420 2f77 702d 6164 6d69 ...HGET./wp-admi
| |
| 0x0040: 6e2f 3f70 7572 6765 5f76 6172 6e69 7368 n/?purge_varnish
| |
| 0x0050: 5f63 6163 6865 3d31 265f 7770 6e6f 6e63 _cache=1&_wpnonc
| |
| 0x0060: 653d 6661 3862 6565 6264 6566 2048 5454 e=fa8beebdef.HTT
| |
| 0x0070: 502f 312e 300d 0a58 2d52 6561 6c2d 4950 P/1.0..X-Real-IP
| |
| 0x0080: 3a20 3736 2e39 372e 3232 332e 3138 350d :.76.97.223.185.
| |
| 0x0090: 0a58 2d46 6f72 7761 7264 6564 2d46 6f72 .X-Forwarded-For
| |
| 0x00a0: 3a20 3736 2e39 372e 3232 332e 3138 350d :.76.97.223.185.
| |
| 0x00b0: 0a58 2d46 6f72 7761 7264 6564 2d50 726f .X-Forwarded-Pro
| |
| 0x00c0: 746f 3a20 6874 7470 730d 0a58 2d46 6f72 to:.https..X-For
| |
| 0x00d0: 7761 7264 6564 2d50 6f72 743a 2034 3433 warded-Port:.443
| |
| 0x00e0: 0d0a 486f 7374 3a20 7777 772e 6f70 656e ..Host:.www.open
| |
| 0x00f0: 736f 7572 6365 6563 6f6c 6f67 792e 6f72 sourceecology.or
| |
| 0x0100: 670d 0a43 6f6e 6e65 6374 696f 6e3a 2063 g..Connection:.c
| |
| 0x0110: 6c6f 7365 0d0a 5573 6572 2d41 6765 6e74 lose..User-Agent
| |
| ...
| |
| </pre>
| |
| # I did a page update in mediawiki while running this tcpdump; I saw the page update come through varnish, but there was no purge.
| |
| <pre>
| |
| [root@hetzner2 varnish]# tcpdump -i lo -nX dst port 6081
| |
| ...
| |
| 23:05:17.973341 IP 127.0.0.1.54964 > 127.0.0.1.6081: Flags [P.], seq 0:4047, ack 1, win 342, options [nop,nop,TS val 3520028743 ecr 3520028743], length 4047
| |
| 0x0000: 4500 1003 96c7 4000 4006 962b 7f00 0001 E.....@.@..+....
| |
| 0x0010: 7f00 0001 d6b4 17c1 fba0 f683 3a84 17b3 ............:...
| |
| 0x0020: 8018 0156 0df8 0000 0101 080a d1cf 6047 ...V..........`G
| |
| 0x0030: d1cf 6047 504f 5354 202f 696e 6465 782e ..`GPOST./index.
| |
| 0x0040: 7068 703f 7469 746c 653d 5573 6572 3a4d php?title=User:M
| |
| 0x0050: 616c 7466 6965 6c64 2661 6374 696f 6e3d altfield&action=
| |
| 0x0060: 7375 626d 6974 2048 5454 502f 312e 300d submit.HTTP/1.0.
| |
| 0x0070: 0a58 2d52 6561 6c2d 4950 3a20 3736 2e39 .X-Real-IP:.76.9
| |
| 0x0080: 372e 3232 332e 3138 350d 0a58 2d46 6f72 7.223.185..X-For
| |
| 0x0090: 7761 7264 6564 2d46 6f72 3a20 3736 2e39 warded-For:.76.9
| |
| 0x00a0: 372e 3232 332e 3138 350d 0a58 2d46 6f72 7.223.185..X-For
| |
| 0x00b0: 7761 7264 6564 2d50 726f 746f 3a20 6874 warded-Proto:.ht
| |
| 0x00c0: 7470 730d 0a58 2d46 6f72 7761 7264 6564 tps..X-Forwarded
| |
| 0x00d0: 2d50 6f72 743a 2034 3433 0d0a 486f 7374 -Port:.443..Host
| |
| 0x00e0: 3a20 7769 6b69 2e6f 7065 6e73 6f75 7263 :.wiki.opensourc
| |
| 0x00f0: 6565 636f 6c6f 6779 2e6f 7267 0d0a 436f eecology.org..Co
| |
| ...
| |
| </pre>
| |
| # I learned that MediaWiki defaults to sending its purge requests over port 80. I changed that to the varnish port that we're using = 6081 by setting these lines in LocalSettings.php
| |
| <pre>
| |
| $wgUseSquid = true;
| |
| $wgSquidServers = array( '127.0.0.1:6081');
| |
| $wgUsePrivateIPs = true;
| |
| </pre>
| |
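| ## since the purge acl above allows localhost, the purge path can also be exercised manually (a sketch; the Host header selects the vhost & the url selects the page to purge):
| |
| <pre>
| |
| # send a PURGE for the wiki's Main_Page directly to varnish on 6081
| |
| curl -X PURGE -H "Host: wiki.opensourceecology.org" "http://127.0.0.1:6081/index.php?title=Main_Page"
| |
| </pre>
| |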
| # then I did a page update in MediaWiki, and confirmed the PURGE came in via `varnishlog`
| |
| <pre>
| |
| ...
| |
| * << Request >> 331532
| |
| - Begin req 331530 rxreq
| |
| - Timestamp Start: 1523574861.741242 0.000000 0.000000
| |
| - Timestamp Req: 1523574861.741242 0.000000 0.000000
| |
| - ReqStart 127.0.0.1 55936
| |
| - ReqMethod PURGE
| |
| - ReqURL /index.php?title=User:Maltfield&action=history
| |
| - ReqProtocol HTTP/1.1
| |
| - ReqHeader Host: wiki.opensourceecology.org
| |
| - ReqHeader Connection: Keep-Alive
| |
| - ReqHeader Proxy-Connection: Keep-Alive
| |
| - ReqHeader User-Agent: MediaWiki/1.30.0 SquidPurgeClient
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_call RECV
| |
| - ReqUnset X-Forwarded-For: 127.0.0.1
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_acl MATCH purge "localhost"
| |
| - VCL_return purge
| |
| - VCL_call HASH
| |
| - VCL_return lookup
| |
| - VCL_call PURGE
| |
| - Debug "VCL_error(200, Purged)"
| |
| - VCL_return synth
| |
| - Timestamp Process: 1523574861.741277 0.000035 0.000035
| |
| - RespHeader Date: Thu, 12 Apr 2018 23:14:21 GMT
| |
| - RespHeader Server: Varnish
| |
| </pre>
| |
| # I loaded a wiki page in an ephemeral browser, updated it in my other logged-in browser, then reloaded it back in the distinct/ephemeral browser. I confirmed that the change came through in the distinct/ephemeral browser. So purging is working!
| |
| # I launched a fresh ephemeral browser in a disposable vm, loaded the page, got a miss, loaded the page again, got a hit. So caching is working for the Main_Page at least
| |
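| ## (hit vs miss can be read straight off the response headers: on a hit, the X-Varnish header carries two XIDs, the current request plus the one that populated the cache, and Age is usually non-zero. a quick check, as a sketch:)
| |
| <pre>
| |
| # two numbers after X-Varnish + a non-zero Age indicate a cache hit
| |
| curl -sI "https://wiki.opensourceecology.org/index.php?title=Main_Page" | grep -iE '^(x-varnish|age|via)'
| |
| </pre>
| |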
| # unfortunately, the page load included several GET requests that were not HITs
| |
| ## /load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=mediawiki.legacy.commonPrint%2Cshared%7Cmediawiki.sectionAnchor%7Cmediawiki.skinning.interface%7Cskins.vector.styles&only=styles&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=startup&only=scripts&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=jquery%2Cmediawiki&only=scripts&skin=vector&version=1ubqa9r
| |
| ## /load.php?debug=false&lang=en&modules=jquery.accessKeyLabel%2CcheckboxShiftClick%2Cclient%2Ccookie%2CgetAttrs%2ChighlightText%2Cmw-jump%2Csuggestions%2CtabIndex%2Cthrottle-debounce%7Cmediawiki.RegExp%2Capi%2Ccookie%2Cnotify%2CsearchSuggest%2Cstorage%2Cto
| |
| ## //load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector
| |
| # I did another refresh and caught the MISSes
| |
| ## /load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=mediawiki.legacy.commonPrint%2Cshared%7Cmediawiki.sectionAnchor%7Cmediawiki.skinning.interface%7Cskins.vector.styles&only=styles&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=startup&only=scripts&skin=vector
| |
| ## /load.php?debug=false&lang=en&modules=jquery.accessKeyLabel%2CcheckboxShiftClick%2Cclient%2Ccookie%2CgetAttrs%2ChighlightText%2Cmw-jump%2Csuggestions%2CtabIndex%2Cthrottle-debounce%7Cmediawiki.RegExp%2Capi%2Ccookie%2Cnotify%2CsearchSuggest%2Cstorage%2Cto
| |
| # so many of those are the same; let's isolate to the first one = "/load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector"
| |
| # umm, but a call from curl yielded a HIT
| |
| <pre>
| |
| user@personal:~$ curl -i "https://wiki.opensourceecology.org/load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector"
| |
| HTTP/1.1 200 OK
| |
| Server: nginx
| |
| Date: Fri, 13 Apr 2018 00:16:49 GMT
| |
| Content-Type: text/css; charset=utf-8
| |
| Content-Length: 921
| |
| Connection: keep-alive
| |
| X-Content-Type-Options: nosniff
| |
| Access-Control-Allow-Origin: *
| |
| ETag: W/"0vstmhv"
| |
| Cache-Control: public, max-age=300, s-maxage=300
| |
| Expires: Thu, 12 Apr 2018 23:21:50 GMT
| |
| X-XSS-Protection: 1; mode=block
| |
| X-Varnish: 297412 426598
| |
| Age: 3599
| |
| Via: 1.1 varnish-v4
| |
| Accept-Ranges: bytes
| |
| Strict-Transport-Security: max-age=15552001
| |
| Public-Key-Pins: pin-sha256="UbSbHFsFhuCrSv9GNsqnGv4CbaVh5UV5/zzgjLgHh9c="; pin-sha256="YLh1dUR9y6Kja30RrAn7JKnbQG/uEtLMkBgFF2Fuihg="; pin-sha256="C5+lpZ7tcVwmwQIMcRtPbsQtWLABXhQzejna0wHFr8M="; pin-sha256="Vjs8r4z+80wjNcr1YKepWQboSIRi63WsWXhIMN+eWys="; pin-sha256="lCppFqbkrlJ3EcVFAkeip0+44VaoJUymbnOaEUk7tEU="; pin-sha256="K87oWBWM9UZfyddvDfoxL+8lpNyoUB2ptGtn0fv6G2Q="; pin-sha256="Y9mvm0exBk1JoQ57f9Vm28jKo5lFm/woKcVxrYxu80o="; pin-sha256="EGn6R6CqT4z3ERscrqNl7q7RCzJmDe9uBhS/rnCHU="; pin-sha256="NIdnza073SiyuN1TUa7DDGjOxc1p0nbfOCfbxPWAZGQ="; pin-sha256="fNZ8JI9p2D/C+bsB3LH3rWejY9BGBDeW0JhMOiMfa7A="; pin-sha256="oyD01TTXvpfBro3QSZc1vIlcMjrdLTiL/M9mLCPX+Zo="; pin-sha256="0cRTd+vc1hjNFlHcLgLCHXUeWqn80bNDH/bs9qMTSPo="; pin-sha256="MDhNnV1cmaPdDDONbiVionUHH2QIf2aHJwq/lshMWfA="; pin-sha256="OIZP7FgTBf7hUpWHIA7OaPVO2WrsGzTl9vdOHLPZmJU="; max-age=3600; includeSubDomains; report-uri="http:opensourceecology.org/hpkp-report"
| |
| | |
| .lang{background:#F9F9F9;border:1px solid #E9E9E9;font-size:smaller;margin:0 0 1em 0;padding:0.5em 1em;text-align:left}.lang ul{display:inline;margin-left:0;padding-left:0}.lang ul li{border-left:1px solid #E4E4E4;display:inline;list-style:none;margin-left:0;padding:0 0.5em}.lang ul li.lang_main{border-left:none;display:inline;list-style:none;margin-left:0;padding-left:0}.lang ul a.external{background:none ! important;padding-right:0 ! important}.lang ul li.lang_title{display:none}.dtree{font-family:Verdana,Geneva,Arial,Helvetica,sans-serif;font-size:11px;color:#666;white-space:nowrap}.dtree img{border:0px;vertical-align:middle}.dtree a{color:#333;text-decoration:none}.dtree a.node,.dtree a.nodeSel{white-space:nowrap;padding:1px 2px 1px 2px}.dtree a.node:hover,.dtree a.nodeSel:hover{color:#333;text-decoration:underline}.dtree a.nodeSel{background-color:#c0d2ec}.dtree .clip{overflow:hidden;padding-bottom:1px}user@personal:~$
| |
| </pre>
| |
| <pre>
| |
| * << Request >> 297412
| |
| - Begin req 297411 rxreq
| |
| - Timestamp Start: 1523578609.269812 0.000000 0.000000
| |
| - Timestamp Req: 1523578609.269812 0.000000 0.000000
| |
| - ReqStart 127.0.0.1 36238
| |
| - ReqMethod GET
| |
| - ReqURL /load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector
| |
| - ReqProtocol HTTP/1.0
| |
| - ReqHeader X-Real-IP: 76.97.223.185
| |
| - ReqHeader X-Forwarded-For: 76.97.223.185
| |
| - ReqHeader X-Forwarded-Proto: https
| |
| - ReqHeader X-Forwarded-Port: 443
| |
| - ReqHeader Host: wiki.opensourceecology.org
| |
| - ReqHeader Connection: close
| |
| - ReqHeader User-Agent: curl/7.38.0
| |
| - ReqHeader Accept: */*
| |
| - ReqUnset X-Forwarded-For: 76.97.223.185
| |
| - ReqHeader X-Forwarded-For: 76.97.223.185, 127.0.0.1
| |
| - VCL_call RECV
| |
| - ReqUnset X-Forwarded-For: 76.97.223.185, 127.0.0.1
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_return hash
| |
| - VCL_call HASH
| |
| - VCL_return lookup
| |
| - Hit 426598
| |
| - VCL_call HIT
| |
| - VCL_return deliver
| |
| - RespProtocol HTTP/1.1
| |
| - RespStatus 200
| |
| - RespReason OK
| |
| - RespHeader Date: Thu, 12 Apr 2018 23:16:50 GMT
| |
| - RespHeader Server: Apache
| |
| - RespHeader X-Content-Type-Options: nosniff
| |
| - RespHeader Access-Control-Allow-Origin: *
| |
| - RespHeader ETag: W/"0vstmhv"
| |
| - RespHeader Cache-Control: public, max-age=300, s-maxage=300
| |
| - RespHeader Expires: Thu, 12 Apr 2018 23:21:50 GMT
| |
| - RespHeader X-XSS-Protection: 1; mode=block
| |
| - RespHeader Content-Length: 921
| |
| - RespHeader Content-Type: text/css; charset=utf-8
| |
| - RespHeader X-Varnish: 297412 426598
| |
| - RespHeader Age: 3599
| |
| - RespHeader Via: 1.1 varnish-v4
| |
| - VCL_call DELIVER
| |
| - VCL_return deliver
| |
| - Timestamp Process: 1523578609.269849 0.000036 0.000036
| |
| - Debug "RES_MODE 2"
| |
| - RespHeader Connection: close
| |
| - RespHeader Accept-Ranges: bytes
| |
| - Timestamp Resp: 1523578609.269870 0.000057 0.000021
| |
| - Debug "XXX REF 2"
| |
| - ReqAcct 288 0 288 438 921 1359
| |
| - End
| |
| </pre>
| |
| # but when I call the same thing from my browser, I get a PASS & fetch (the obvious difference from the curl request is the browser's extra headers, e.g. If-None-Match & Accept-Language, so presumably something in our vcl_recv triggers the pass on one of those)
| |
| <pre>
| |
| * << Request >> 201266
| |
| - Begin req 201265 rxreq
| |
| - Timestamp Start: 1523578751.880694 0.000000 0.000000
| |
| - Timestamp Req: 1523578751.880694 0.000000 0.000000
| |
| - ReqStart 127.0.0.1 36486
| |
| - ReqMethod GET
| |
| - ReqURL /load.php?debug=false&lang=en&modules=site.styles&only=styles&skin=vector
| |
| - ReqProtocol HTTP/1.0
| |
| - ReqHeader X-Real-IP: 76.97.223.185
| |
| - ReqHeader X-Forwarded-For: 76.97.223.185
| |
| - ReqHeader X-Forwarded-Proto: https
| |
| - ReqHeader X-Forwarded-Port: 443
| |
| - ReqHeader Host: wiki.opensourceecology.org
| |
| - ReqHeader Connection: close
| |
| - ReqHeader User-Agent: Mozilla/5.0 (X11; Fedora; Linux x86_64; rv:50.0) Gecko/20100101 Firefox/50.0
| |
| - ReqHeader Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
| |
| - ReqHeader Accept-Language: en-US,en;q=0.5
| |
| - ReqHeader Accept-Encoding: gzip, deflate, br
| |
| - ReqHeader Upgrade-Insecure-Requests: 1
| |
| - ReqHeader If-None-Match: W/"1a4906v"
| |
| - ReqUnset X-Forwarded-For: 76.97.223.185
| |
| - ReqHeader X-Forwarded-For: 76.97.223.185, 127.0.0.1
| |
| - VCL_call RECV
| |
| - ReqUnset X-Forwarded-For: 76.97.223.185, 127.0.0.1
| |
| - ReqHeader X-Forwarded-For: 127.0.0.1
| |
| - VCL_return pass
| |
| - VCL_call HASH
| |
| - VCL_return lookup
| |
| - VCL_call PASS
| |
| - VCL_return fetch
| |
| - Link bereq 201267 pass
| |
| - Timestamp Fetch: 1523578751.993003 0.112309 0.112309
| |
| - RespProtocol HTTP/1.0
| |
| - RespStatus 304
| |
| - RespReason Not Modified
| |
| - RespHeader Date: Fri, 13 Apr 2018 00:19:11 GMT
| |
| - RespHeader Server: Apache
| |
| - RespHeader ETag: W/"1a4906v"
| |
| - RespHeader Expires: Fri, 13 Apr 2018 00:24:11 GMT
| |
| - RespHeader Cache-Control: public, max-age=300, s-maxage=300
| |
| - RespProtocol HTTP/1.1
| |
| - RespHeader X-Varnish: 201266
| |
| - RespHeader Age: 0
| |
| - RespHeader Via: 1.1 varnish-v4
| |
| - VCL_call DELIVER
| |
| - VCL_return deliver
| |
| - Timestamp Process: 1523578751.993043 0.112350 0.000040
| |
| - Debug "RES_MODE 0"
| |
| - RespHeader Connection: close
| |
| - Timestamp Resp: 1523578751.993091 0.112397 0.000048
| |
| - Debug "XXX REF 1"
| |
| - ReqAcct 540 0 540 258 0 258
| |
| - End
| |
| </pre>
| |
| # I think I just crossed the line of diminishing returns. The pages themselves are definitely being cached. Images are being cached. Maybe some minification resources are not being cached. I'm confident that the site will run fine after cutover. Then, once it's live in prod, I'll get munin graphs & varnish logs to show me which requests are *not* hits, and I can optimize from there (as was done with osemain by removing the Fundraising addon post-migration).
| |
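| ## once it's live, something like the following could surface the top non-hit URLs (a sketch; varnishtop on varnish 4.x accepts the same VSL queries as varnishlog):
| |
| <pre>
| |
| # rank request URLs for transactions that went through vcl_miss
| |
| varnishtop -i ReqURL -q 'VCL_call eq "MISS"'
| |
| </pre>
| |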
| | |
| =Wed Apr 11, 2018=
| |
| # I haven't heard back from dreamhost, and they haven't deleted what remains of our backup data yet.
| |
| # checked the state of storage on dreamhost, and found our old backups dir has 52G. Our new backups dirs have 44G.
| |
| <pre>
| |
| hancock% date
| |
| Wed Apr 11 19:40:17 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose
| |
| hancock% du -sh hetzner1/*
| |
| 12G hetzner1/20180409-052001
| |
| 12G hetzner1/20180410-052001
| |
| 12G hetzner1/20180411-052001
| |
| hancock% du -sh hetzner2/*
| |
| 2.8G hetzner2/20180409-072001
| |
| 2.8G hetzner2/20180410-072001
| |
| 2.8G hetzner2/20180411-072001
| |
| hancock% du -sh backups/hetzner1/*
| |
| 248M backups/hetzner1/20180402-052001
| |
| 0 backups/hetzner1/20180406-052001
| |
| 12G backups/hetzner1/20180407-052001
| |
| 12G backups/hetzner1/20180408-052001
| |
| hancock% du -sh backups/hetzner2/*
| |
| 0 backups/hetzner2/20180406-072001
| |
| 14G backups/hetzner2/20180407-072001
| |
| 14G backups/hetzner2/20180408-072001
| |
| hancock%
| |
| </pre>
| |
| # since we have 3x copies of daily backups in the new dir, I just went ahead and deleted the 52G remaining from the old daily backup dir
| |
| <pre>
| |
| hancock% date
| |
| Wed Apr 11 19:43:20 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose/backups
| |
| hancock% du -sh hetzner1/*
| |
| 248M hetzner1/20180402-052001
| |
| 0 hetzner1/20180406-052001
| |
| 12G hetzner1/20180407-052001
| |
| 12G hetzner1/20180408-052001
| |
| hancock% du -sh hetzner2/*
| |
| 0 hetzner2/20180406-072001
| |
| 14G hetzner2/20180407-072001
| |
| 14G hetzner2/20180408-072001
| |
| hancock% rm -rf hetzner1/*
| |
| zsh: sure you want to delete all the files in /home/marcin_ose/backups/hetzner1 [yn]? y
| |
| hancock% rm -rf hetzner2/*
| |
| zsh: sure you want to delete all the files in /home/marcin_ose/backups/hetzner2 [yn]? y
| |
| hancock% ls -lah hetzner1/
| |
| total 4.0K
| |
| drwxr-xr-x 2 marcin_ose pg1589252 10 Apr 11 19:43 .
| |
| drwxr-xr-x 4 marcin_ose pg1589252 4.0K Apr 9 13:20 ..
| |
| hancock% ls -lah hetzner2/
| |
| total 4.0K
| |
| drwxr-xr-x 2 marcin_ose pg1589252 10 Apr 11 19:43 .
| |
| drwxr-xr-x 4 marcin_ose pg1589252 4.0K Apr 9 13:20 ..
| |
| hancock%
| |
| </pre>
| |
| # now our entire home dir's usage is 47G!
| |
| <pre>
| |
| hancock% date
| |
| Wed Apr 11 19:46:23 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose
| |
| hancock% du -sh
| |
| 47G .
| |
| hancock%
| |
| </pre>
| |
| # I expect that once I start working on the wiki again, backups will grow to probably 12*4 + 20*4 = ~128G. Then, after the wiki is migrated, the hetzner1 backups will become negligible (it's mostly just the wiki), so we'll only have ~20*4 = 80G of backups to store. The long-term solution is to migrate to s3 with lifecycle policies: the first-of-the-month daily goes into glacier, and every other backup gets automatically deleted a few days after upload. The cost difference between 128G in s3 vs 80G in s3 may be small enough that I should focus on getting the wiki migrated before recoding our backup scripts to target s3, all while hoping that dreamhost doesn't notice that our daily backups (which are significantly smaller than the >500G usage before) have just moved to another dir.
| |
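| ## (ballpark math, assuming s3 standard storage at ~$0.023/GB-month, the current us-west-2 rate: 128G ≈ $2.94/mo vs 80G ≈ $1.84/mo, so the delta between the two scenarios is only ~$1.10/mo, which supports doing the wiki migration first)
| |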
| # wow, within a few minutes of deleting the big directories in the 'backups' dir, I got an email from "Jin K" at dreamhost stating that our "Action Required: Disk usage warning - acceptable use policy violation" ticket was "RESOLVED!", thanking us for taking care of it. Either that response was automated, or I got to it just before they deleted it for us
| |
| <pre>
| |
| Hi there!
| |
| | |
| It looks like we've applied more time for you previously regarding the
| |
| backup location that was left, but it looks like you've cleared that up
| |
| since then as the location below is now empty:
| |
| | |
| hancock:/home/marcin_ose/backups#
| |
| 96K .
| |
| | |
| Thanks for getting that done! This notice is just to let you know that
| |
| we're all set and this matter is now closed.
| |
| | |
| Please give us a shout at any time if you have any questions or concerns
| |
| at all moving forward. We'd be happy to help!
| |
| | |
| | |
| Thank you kindly,
| |
| | |
| Jin K.
| |
| </pre>
| |
| | |
| =Mon Apr 09, 2018=
| |
| # I confirmed that the backups from last night came into their new location
| |
| <pre>
| |
| hancock% du -sh ../hetzner1/*
| |
| 12G ../hetzner1/20180409-052001
| |
| hancock% du -sh ../hetzner2/*
| |
| 2.8G ../hetzner2/20180409-072001
| |
| hancock%
| |
| </pre>
| |
| # I deleted the encryption key from dreamhost's server. Future backups can be done on hetzner's servers directly.
| |
| <pre>
| |
| hancock% chmod 0700 ose-backups-cron.key
| |
| hancock% shred -u ose-backups-cron.key
| |
| hancock%
| |
| </pre>
| |
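| ## (worth noting: the shred man page warns that its overwrite guarantees are weak on journaled or CoW filesystems, but for a key file that's being deleted anyway it's still strictly better than a plain rm)
| |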
| # now our home dir's entire usage is currently 121G
| |
| <pre>
| |
| hancock% date
| |
| Mon Apr 9 13:18:36 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose
| |
| hancock% du -sh
| |
| 121G .
| |
| hancock%
| |
| </pre>
| |
| # 104G of that is going to be automatically deleted by the cron over the next week as the dailies become stale
| |
| <pre>
| |
| hancock% du -sh backups/*
| |
| 4.0K backups/getGlacierJob.sh
| |
| 48G backups/hetzner1
| |
| 56G backups/hetzner2
| |
| 4.0K backups/output.json
| |
| 4.0K backups/readme.txt
| |
| 64K backups/retryUploadToGlacier.log
| |
| 4.0K backups/retryUploadToGlacier.sh
| |
| 28M backups/uploadToGlacier
| |
| 4.0K backups/uploadToGlacier.py
| |
| 8.0K backups/uploadToGlacier.sh
| |
| hancock% du -sh backups
| |
| 104G backups
| |
| hancock%
| |
| </pre>
| |
| # I deleted the entire uploadToGlacier directory, which only had fileLists that failed to delete due to a minor bug in my script
| |
| <pre>
| |
| hancock% du -sh uploadToGlacier/*
| |
| 2.4M uploadToGlacier/hetzner1_20170701-052001.fileList.txt.bz2
| |
| 2.4M uploadToGlacier/hetzner1_20170801-052001.fileList.txt.bz2
| |
| 2.3M uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2
| |
| 2.3M uploadToGlacier/hetzner1_20180101-062001.fileList.txt.bz2
| |
| 2.4M uploadToGlacier/hetzner1_20180201-062001.fileList.txt.bz2
| |
| 2.4M uploadToGlacier/hetzner1_20180301-062002.fileList.txt.bz2
| |
| 2.2M uploadToGlacier/hetzner1_20180401-052001.fileList.txt.bz2
| |
| 2.0M uploadToGlacier/hetzner2_20170702-052001.fileList.txt.bz2
| |
| 196K uploadToGlacier/hetzner2_20170801-072001.fileList.txt.bz2
| |
| 284K uploadToGlacier/hetzner2_20170901-072001.fileList.txt.bz2
| |
| 648K uploadToGlacier/hetzner2_20171001-072001.fileList.txt.bz2
| |
| 276K uploadToGlacier/hetzner2_20171101-072001.fileList.txt.bz2
| |
| 308K uploadToGlacier/hetzner2_20171202-072001.fileList.txt.bz2
| |
| 488K uploadToGlacier/hetzner2_20180102-072001.fileList.txt.bz2
| |
| 2.4M uploadToGlacier/hetzner2_20180202-072001.fileList.txt.bz2
| |
| 3.4M uploadToGlacier/hetzner2_20180302-072001.fileList.txt.bz2
| |
| 1.6M uploadToGlacier/hetzner2_20180401-072001.fileList.txt.bz2
| |
| hancock% rm -rf uploadToGlacier
| |
| hancock%
| |
| </pre>
| |
| # I updated the crontab to clean backups from the new backup dirs as well
| |
| <pre>
| |
| hancock% crontab -l
| |
| ###--- BEGIN DREAMHOST BLOCK
| |
| ###--- Changes made to this part of the file WILL be destroyed!
| |
| # Backup site-root
| |
| MAILTO="elifarley@gmail.com"
| |
| @weekly /usr/local/bin/setlock -n /tmp/cronlock.2671804.96324 sh -c $'. \176/altroot/init.sh \012\043 \012\176/bin/mbkp.sh site-root'
| |
| # Backup MONTHLY
| |
| MAILTO="elifarley@gmail.com"
| |
| @monthly /usr/local/bin/setlock -n /tmp/cronlock.2671804.96354 sh -c $'. \176/altroot/init.sh \012\043 \012\176/bin/mbkp.sh home \012\176/bin/mbkp.sh altroot \012\043\176/bin/mbkp.sh blog-cache \012'
| |
| | |
| # delete older backup files
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/backups/hetzner1 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/backups/hetzner2 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| ###--- You can make changes below the next line and they will be preserved!
| |
| ###--- END DREAMHOST BLOCK
| |
| hancock%
| |
| hancock% crontab -e
| |
| ...
| |
| hancock% crontab -l
| |
| ###--- BEGIN DREAMHOST BLOCK
| |
| ###--- Changes made to this part of the file WILL be destroyed!
| |
| # Backup site-root
| |
| MAILTO="elifarley@gmail.com"
| |
| @weekly /usr/local/bin/setlock -n /tmp/cronlock.2671804.96324 sh -c $'. \176/altroot/init.sh \012\043 \012\176/bin/mbkp.sh site-root'
| |
| # Backup MONTHLY
| |
| MAILTO="elifarley@gmail.com"
| |
| @monthly /usr/local/bin/setlock -n /tmp/cronlock.2671804.96354 sh -c $'. \176/altroot/init.sh \012\043 \012\176/bin/mbkp.sh home \012\176/bin/mbkp.sh altroot \012\043\176/bin/mbkp.sh blog-cache \012'
| |
| | |
| # delete older backup files
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/backups/hetzner1 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/backups/hetzner2 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/hetzner1 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| 20 22 * * * /usr/bin/perl /home/marcin_ose/bin/cleanLocal.pl -l /home/marcin_ose/hetzner2 -d 3 &>> /home/marcin_ose/logs/cleanBackups.log
| |
| ###--- You can make changes below the next line and they will be preserved!
| |
| ###--- END DREAMHOST BLOCK
| |
| hancock%
| |
| </pre>
| |
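| ## I don't have cleanLocal.pl in front of me here, but judging by its -l (location) & -d (days) args, its effect is presumably roughly equivalent to this (hypothetical sketch, not the actual script):
| |
| <pre>
| |
| # delete backup dirs under the given location that are more than 3 days old
| |
| find /home/marcin_ose/hetzner1 -mindepth 1 -maxdepth 1 -mtime +3 -exec rm -rf {} +
| |
| </pre>
| |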
| | |
| =Sun Apr 08, 2018=
| |
| # I checked again just after midnight; the retry appears to have worked pretty great. Just 2 archives failed on this run
| |
| <pre>
| |
| hancock% date
| |
| Sat Apr 7 22:14:46 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose/backups
| |
| hancock% du -sh uploadToGlacier/*.gpg
| |
| 39G uploadToGlacier/hetzner1_20170801-052001.tar.gpg
| |
| 12G uploadToGlacier/hetzner1_20180101-062001.tar.gpg
| |
| hancock%
| |
| </pre>
| |
| # the archive list on hetzner2 looks pretty great too
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170701-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170801-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20171001-052001.tar.gpg
| |
| hetzner1_20171101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171101-062001.tar.gpg
| |
| hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180201-062001.tar.gpg
| |
| hetzner1_20180301-062002.fileList.txt.bz2.gpg
| |
| hetzner1_20180301-062002.tar.gpg
| |
| hetzner1_20180401-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20180401-052001.tar.gpg
| |
| hetzner2_20170702-052001.fileList.txt.bz2.gpg
| |
| hetzner2_20170702-052001.tar.gpg
| |
| hetzner2_20170801-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170801-072001.tar.gpg
| |
| hetzner2_20170901-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170901-072001.tar.gpg
| |
| hetzner2_20171001-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171001-072001.tar.gpg
| |
| hetzner2_20171101-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171101-072001.tar.gpg
| |
| hetzner2_20171202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171202-072001.tar.gpg
| |
| hetzner2_20180102-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180102-072001.tar.gpg
| |
| hetzner2_20180202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180302-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180401-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180401-072001.tar.gpg
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # I kicked off a fresh inventory fetch; hopefully that'll get the ones that I just uploaded on retry
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync --max-age=0 --wait deleteMeIn2020
| |
| | |
| </pre>
| |
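| ## (glacier inventory jobs are slow by design; they typically take around 4 hours to complete, hence the --wait)
| |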
| # meanwhile, back on dreamhost's hancock, I kicked off the re-upload attempt for the remaining 2x archives
| |
| # I also checked the aws console; the bill for April so far is $3.32. Not bad!
| |
| ## most of that is $3.25 for 65,038 requests.
| |
| # Unfortunately, I discovered that the AWS Budget service is itself not free. We apparently have $0 charges because there's 62 days of free Budget service in the Free Tier.
| |
| ## according to their docs, the first 2x budgets are free of charge. Additional budgets are $0.02/day. I'll leave our $10 budget email alert in place; if we ever get charged >$0 for it, I'll delete it.
| |
| # I changed the existing budget from $1 to $8 (so $96/yr). I added alerts for exceeding both the actual & forecasted amounts. fwiw, we're currently only being charged $2.28, but the forecast is $9.25. I assume that's expecting us to keep uploading at our current rate all month, which won't happen..
| |
| # ...
| |
| # when I woke up, the 2x remaining uploads completed successfully!
| |
| # I checked on the archive list, but it still didn't show the complete list; so I kicked off another inventory refresh
| |
| # after the inventory shows all the archives, I'll delete them from dreamhost. Tomorrow is the deadline, so hopefully this can be done today.
| |
| # I got an email that the projected spend would exceed the $8 budget. The actual spend is still $3.97.
| |
| # ...
| |
| # a few hours later, the inventory sync was complete, and many of the archives were now listed
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170701-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170701-052001.tar.gpg
| |
| hetzner1_20170801-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20171001-052001.tar.gpg
| |
| hetzner1_20171101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171101-062001.tar.gpg
| |
| id:lryfyQFE4NbtWg5Q6uTq8Qqyc-y9il9WYe7lHs8H2lzFSBADOJQmCIgp6FxrkiaCcwnSMIReJPWyWcR4UOnurxwONhw8fojEHQTTeOpkf6fgfWBAPP9P6GOZZ0v8d8Jz_-QFVaV6Bw hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| id:NR3Z9zdD2rW0NG1y3QW735TzykIivP_cnFDMCNX6RcIPh0mRb_6QiC5qy1GrBTIoroorfzaGDIKQ0BY18jbcR3XfEzfcmrZ1FiT1YvQw-c1ag6vT46-noPvmddZ_zyy2O1ItIygI6Q hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171201-062001.tar.gpg
| |
| hetzner1_20180101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180201-062001.tar.gpg
| |
| hetzner1_20180301-062002.fileList.txt.bz2.gpg
| |
| hetzner1_20180301-062002.tar.gpg
| |
| hetzner1_20180401-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20180401-052001.tar.gpg
| |
| hetzner2_20170702-052001.fileList.txt.bz2.gpg
| |
| hetzner2_20170702-052001.tar.gpg
| |
| hetzner2_20170801-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170801-072001.tar.gpg
| |
| hetzner2_20170901-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170901-072001.tar.gpg
| |
| hetzner2_20171001-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171001-072001.tar.gpg
| |
| hetzner2_20171101-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171101-072001.tar.gpg
| |
| hetzner2_20171202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171202-072001.tar.gpg
| |
| hetzner2_20180102-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180102-072001.tar.gpg
| |
| hetzner2_20180202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180202-072001.tar.gpg
| |
| hetzner2_20180302-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180302-072001.tar.gpg
| |
| hetzner2_20180401-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180401-072001.tar.gpg
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| ## we can see that these archives are obviously in the glacier vault (for each prefix, both the encrypted tarball & the fileList are present & named as expected)
| |
| <pre>
| |
| hetzner1_20170701-052001
| |
| hetzner1_20171001-052001
| |
| hetzner1_20171101-062001
| |
| hetzner1_20180201-062001
| |
| hetzner1_20180301-062002
| |
| hetzner1_20180401-052001
| |
| hetzner2_20170702-052001
| |
| hetzner2_20170801-072001
| |
| hetzner2_20170901-072001
| |
| hetzner2_20171001-072001
| |
| hetzner2_20171101-072001
| |
| hetzner2_20171202-072001
| |
| hetzner2_20180102-072001
| |
| hetzner2_20180202-072001
| |
| hetzner2_20180302-072001
| |
| hetzner2_20180401-072001
| |
| </pre>
| |
| ## but there are a couple more that have an odd naming convention and/or duplicate archives from when I was testing the upload script; these also appear to be in the glacier vault:
| |
| <pre>
| |
| hetzner1_20170901-052001
| |
| hetzner1_20171201-062001
| |
| </pre>
| |
| ## finally, there are a couple where I didn't include the file name in the archive description, from my very early testing, back before I realized that glacier wouldn't remember the archive name and that it would only be given a reference uid. From my notes, I know that these archives correspond to:
| |
| <pre>
| |
| hetzner1_20171001-052001
| |
| </pre>
| |
| # therefore, the following archives have been fully uploaded into glacier, and they can be deleted from dreamhost:
| |
| <pre>
| |
| hetzner1_20170701-052001
| |
| hetzner1_20170901-052001
| |
| hetzner1_20171001-052001
| |
| hetzner1_20171101-062001
| |
| hetzner1_20171201-062001
| |
| hetzner1_20180201-062001
| |
| hetzner1_20180301-062002
| |
| hetzner1_20180401-052001
| |
| hetzner2_20170702-052001
| |
| hetzner2_20170801-072001
| |
| hetzner2_20170901-072001
| |
| hetzner2_20171001-072001
| |
| hetzner2_20171101-072001
| |
| hetzner2_20171202-072001
| |
| hetzner2_20180102-072001
| |
| hetzner2_20180202-072001
| |
| hetzner2_20180302-072001
| |
| hetzner2_20180401-072001
| |
| </pre>
| |
| # I kicked-off the deletions
| |
| <pre>
| |
| hancock% rm -rf hetzner1/20170701-052001
| |
| hancock% rm -rf hetzner1/20170701-052001
| |
| hancock% rm -rf hetzner1/20171001-052001
| |
| hancock% rm -rf hetzner1/20171101-062001
| |
| hancock% rm -rf hetzner1/20180201-062001
| |
| hancock% rm -rf hetzner1/20180301-062002
| |
| hancock% rm -rf hetzner1/20180401-052001
| |
| hancock% rm -rf hetzner2/20170702-052001
| |
| hancock% rm -rf hetzner2/20170801-072001
| |
| hancock% rm -rf hetzner2/20170901-072001
| |
| hancock% rm -rf hetzner2/20171001-072001
| |
| hancock% rm -rf hetzner2/20171101-072001
| |
| hancock% rm -rf hetzner2/20171202-072001
| |
| hancock% rm -rf hetzner2/20180102-072001
| |
| hancock% rm -rf hetzner2/20180202-072001
| |
| hancock% rm -rf hetzner2/20180302-072001
| |
| hancock% rm -rf hetzner2/20180401-072001
| |
| hancock% rm -rf hetzner1/20170901-052001
| |
| hancock% rm -rf hetzner1/20171201-062001
| |
| hancock% rm -rf hetzner1/20171001-052001
| |
| hancock%
| |
| </pre>
| |
| # and here's what remains
| |
| <pre>
| |
| hancock% du -sh hetzner1/*
| |
| 39G hetzner1/20170801-052001
| |
| 12G hetzner1/20180101-062001
| |
| 248M hetzner1/20180402-052001
| |
| 0 hetzner1/20180403-052001
| |
| 12G hetzner1/20180404-052001
| |
| 12G hetzner1/20180405-052001
| |
| 12G hetzner1/20180406-052001
| |
| 12G hetzner1/20180407-052001
| |
| 12G hetzner1/20180408-052001
| |
| hancock% du -sh hetzner2/*
| |
| 0 hetzner2/20180403-072001
| |
| 14G hetzner2/20180404-072001
| |
| 14G hetzner2/20180405-072001
| |
| 14G hetzner2/20180406-072001
| |
| 14G hetzner2/20180407-072001
| |
| 14G hetzner2/20180408-072001
| |
| hancock%
| |
| </pre>
| |
| ## so that finishes off hetzner2. The backups that are present are just the recent few days' worth (eventually the dailies may have to go to s3, but for now I'm primarily focused on shipping our historical monthlies off to somewhere safe = glacier)
| |
| ## hetzner1 has 2x remaining archives that need to be confirmed in glacier, then deleted from dreamhost:
| |
| <pre>
| |
| hancock% du -sh hetzner1/*
| |
| 39G hetzner1/20170801-052001
| |
| 12G hetzner1/20180101-062001
| |
| </pre>
| |
| # unfortunately, the hetzner2 backups have exploded to 14G again; they should be ~3G each. Looks like I still had an 'orig' copy of the archives I restored when testing glacier in the /root/glacierRestore directory, and the '/root' directory is itself backed-up.
| |
| ## I updated the "restore from glacier" documentation on the wiki so that the 'glacier-cli' dir is placed in '/root/sandbox/', the 'glacier.py' binary is linked to by '/root/bin/glacier.py' (/root/bin is already in $PATH), and that the restores themselves get done in a temporary directory in /var/tmp/
| |
| # I updated the upload path in '/root/backups/backup.settings' to be '/home/marcin_ose/hetzner2' instead of '/home/marcin_ose/backups/hetzner2'
| |
| # I updated the upload path in '/usr/home/osemain/backups/backup.settings' to be '/home/marcin_ose/hetzner1' instead of '/home/marcin_ose/backups/hetzner1'
| |
| # ...
| |
| # I checked again just before midnight, and here's the new listing
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170701-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170701-052001.tar.gpg
| |
| hetzner1_20170801-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20170801-052001.tar.gpg
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20171001-052001.tar.gpg
| |
| hetzner1_20171101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171101-062001.tar.gpg
| |
| id:lryfyQFE4NbtWg5Q6uTq8Qqyc-y9il9WYe7lHs8H2lzFSBADOJQmCIgp6FxrkiaCcwnSMIReJPWyWcR4UOnurxwONhw8fojEHQTTeOpkf6fgfWBAPP9P6GOZZ0v8d8Jz_-QFVaV6Bw hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| id:NR3Z9zdD2rW0NG1y3QW735TzykIivP_cnFDMCNX6RcIPh0mRb_6QiC5qy1GrBTIoroorfzaGDIKQ0BY18jbcR3XfEzfcmrZ1FiT1YvQw-c1ag6vT46-noPvmddZ_zyy2O1ItIygI6Q hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171201-062001.tar.gpg
| |
| hetzner1_20180101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180101-062001.tar.gpg
| |
| hetzner1_20180201-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20180201-062001.tar.gpg
| |
| hetzner1_20180301-062002.fileList.txt.bz2.gpg
| |
| hetzner1_20180301-062002.tar.gpg
| |
| hetzner1_20180401-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20180401-052001.tar.gpg
| |
| hetzner2_20170702-052001.fileList.txt.bz2.gpg
| |
| hetzner2_20170702-052001.tar.gpg
| |
| hetzner2_20170801-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170801-072001.tar.gpg
| |
| hetzner2_20170901-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20170901-072001.tar.gpg
| |
| hetzner2_20171001-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171001-072001.tar.gpg
| |
| hetzner2_20171101-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171101-072001.tar.gpg
| |
| hetzner2_20171202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20171202-072001.tar.gpg
| |
| hetzner2_20180102-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180102-072001.tar.gpg
| |
| hetzner2_20180202-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180202-072001.tar.gpg
| |
| hetzner2_20180302-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180302-072001.tar.gpg
| |
| hetzner2_20180401-072001.fileList.txt.bz2.gpg
| |
| hetzner2_20180401-072001.tar.gpg
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| ## that includes the following, which are the 2x archives that were previously absent
| |
| <pre>
| |
| hetzner1/20170801-052001
| |
| hetzner1/20180101-062001
| |
| </pre>
| |
| # I deleted those archives from dreamhost
| |
| <pre>
| |
| rm -rf hetzner1/20170801-052001
| |
| rm -rf hetzner1/20180101-062001
| |
| </pre>
| |
| # that leaves only recent daily archives
| |
| <pre>
| |
| hancock% du -sh hetzner1/*
| |
| 248M hetzner1/20180402-052001
| |
| 0 hetzner1/20180403-052001
| |
| 12G hetzner1/20180404-052001
| |
| 12G hetzner1/20180405-052001
| |
| 12G hetzner1/20180406-052001
| |
| 12G hetzner1/20180407-052001
| |
| 12G hetzner1/20180408-052001
| |
| hancock% du -sh hetzner2/*
| |
| 0 hetzner2/20180403-072001
| |
| 14G hetzner2/20180404-072001
| |
| 14G hetzner2/20180405-072001
| |
| 14G hetzner2/20180406-072001
| |
| 14G hetzner2/20180407-072001
| |
| 14G hetzner2/20180408-072001
| |
| hancock%
| |
| </pre>
| |
| | |
| =Sat Apr 07, 2018=
| |
| # checked dreamhost; the screen died (damn dreamhost), so I can't see the last command's output (shoulda sent it to a log file..)
| |
| # anyway, I can tell which uploads appear to have failed from the gpg files left in the dir
| |
| <pre>
| |
| hancock% date
| |
| Sat Apr 7 08:27:09 PDT 2018
| |
| hancock% pwd
| |
| /home/marcin_ose/backups/uploadToGlacier
| |
| hancock% ls -lah *.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 39G Apr 4 18:59 hetzner1_20170701-052001.tar.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 39G Apr 4 22:13 hetzner1_20170801-052001.tar.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 2.3M Apr 3 16:15 hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 12G Apr 3 16:37 hetzner1_20171201-062001.tar.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 12G Apr 5 00:51 hetzner1_20180101-062001.tar.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 14G Apr 4 11:00 hetzner2_20180202-072001.tar.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 25G Apr 4 12:39 hetzner2_20180302-072001.tar.gpg
| |
| hancock%
| |
| </pre>
| |
| # I can't confirm as the inventory is stale, but I kicked-off a sync
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20171001-052001.tar.gpg
| |
| hetzner1_20171101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171101-062001.tar.gpg
| |
| hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| vault sync deleteMeIn2020
| |
| glacier: queued inventory job for u'deleteMeIn2020'
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # so this is the list of failed backups that I need to retry. It's ~150G, which is too big to stash on hetzner2; hopefully this retry knocks it below 90G, which is how much free space we have on hetzner2
| |
| <pre>
| |
| hancock% du -sh *.gpg
| |
| 39G hetzner1_20170701-052001.tar.gpg
| |
| 39G hetzner1_20170801-052001.tar.gpg
| |
| 2.3M hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| 12G hetzner1_20171201-062001.tar.gpg
| |
| 12G hetzner1_20180101-062001.tar.gpg
| |
| 14G hetzner2_20180202-072001.tar.gpg
| |
| 25G hetzner2_20180302-072001.tar.gpg
| |
| hancock%
| |
| </pre>
| |
| # the most concerning is the first backup I made of the hetzner1 server in 2017-07. I made that backup just before I deleted anything, so we really, really need it in glacier.
| |
| # I copied the uploadToGlacier.sh script to a modified new script named retryUploadToGlacier.sh
| |
| <pre>
| |
| hancock% cat retryUploadToGlacier.sh
| |
| #!/bin/bash -x
| |
| | |
| ############
| |
| # SETTINGS #
| |
| ############
| |
| | |
| backupArchives="uploadToGlacier/hetzner1_20170701-052001.tar.gpg uploadToGlacier/hetzner1_20170801-052001.tar.gpg uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2.gpg uploadToGlacier/hetzner1_20171201-062001.tar.gpg uploadToGlacier/hetzner1_20180101-062001.tar.gpg uploadToGlacier/hetzner2_20180202-072001.tar.gpg uploadToGlacier/hetzner2_20180302-072001.tar.gpg"
| |
| | |
| export AWS_ACCESS_KEY_ID='CHANGEME'
| |
| export AWS_SECRET_ACCESS_KEY='CHANGEME'
| |
| | |
| ##############
| |
| # DO UPLOADS #
| |
| ##############
| |
| | |
| for archive in $(echo $backupArchives); do
| |
| | |
| # upload it
| |
| /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 "${archive}"
| |
| | |
| if $? -eq 0 ; then
| |
| rm -f "${archive}"
| |
| fi
| |
| | |
| done
| |
| hancock%
| |
| </pre>
| |
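| ## note: the `if $? -eq 0 ; then` guard in the script above is actually broken bash; without test brackets the shell tries to execute the exit code itself as a command (the `+ 1 -eq 0` line in the Apr 04 trace below is exactly that), which evaluates false for every upload, so even successful archives never get rm'd. That's the safe failure mode, but a corrected guard would look like:
| |
| <pre>
| |
| if [ $? -eq 0 ]; then
| |
|         # only delete the local archive if the glacier upload succeeded
| |
|         rm -f "${archive}"
| |
| fi
| |
| </pre>
| |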
| # kicked off the script, this time logging to a file
| |
| <pre>
| |
| hancock% ./retryUploadToGlacier.sh &> retryUploadToGlacier.log
| |
| </pre>
| |
| # it's uploading the 201707 archive to hetzner1
| |
| <pre>
| |
| hancock% tail -f retryUploadToGlacier.log
| |
| + backupArchives='uploadToGlacier/hetzner1_20170701-052001.tar.gpg uploadToGlacier/hetzner1_20170801-052001.tar.gpg uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2.gpg uploadToGlacier/hetzner1_20171201-062001.tar.gpg uploadToGlacier/hetzner1_20180101-062001.tar.gpg uploadToGlacier/hetzner2_20180202-072001.tar.gpg uploadToGlacier/hetzner2_20180302-072001.tar.gpg'
| |
| + export AWS_ACCESS_KEY_ID=CHANGEME
| |
| + AWS_ACCESS_KEY_ID=CHANGEME
| |
| + export AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| + AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| ++ echo uploadToGlacier/hetzner1_20170701-052001.tar.gpg uploadToGlacier/hetzner1_20170801-052001.tar.gpg uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2.gpg uploadToGlacier/hetzner1_20171201-062001.tar.gpg uploadToGlacier/hetzner1_20180101-062001.tar.gpg uploadToGlacier/hetzner2_20180202-072001.tar.gpg uploadToGlacier/hetzner2_20180302-072001.tar.gpg
| |
| + for archive in '$(echo $backupArchives)'
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 uploadToGlacier/hetzner1_20170701-052001.tar.gpg
| |
| </pre>
| |
| ## I'll check on this in the evening
| |
| | |
| =Thr Apr 05, 2018=
| |
| # the backup script is still running on the dreamhost server
| |
| ## It's currently on 'hetzner1_20180401-052001', which is the last one!
| |
| ## It looks like some uploads have succeeded & some have failed. The staging dir = 'uploadToGlacier/', which holds all the encrypted archives (archives that fail are intentionally not deleted so I can re-try them later), has swelled to 150G. I already deleted ~250G, so I think dreamhost can deal with that (at least for 4 more days, which is our deadline)
| |
| # I got an email saying that our budget of $1/mo was exceeded with our aws account. Good to know that works! I'll increase that to $10 soon
| |
| # I'll come back to this tomorrow, after the script has finished running. Then I'll try uploading the remaining files sequentially in a loop, and hopefully they'll all be done by our deadline
| |
| # Marcin emailed me regarding broken images on the workshops page of osemain https://www.opensourceecology.org/workshops-and-programs/
| |
| ## I can't fix this now as my phone is broken & I have to rebuild it + restore my 2fa tokens to login, but I told him the fix is to replace the absolute urls that use the wrong protocol (http) with relative paths to the images, which is more robust
| |
| | |
| =Wed Apr 04, 2018=
| |
| # the upload of the archive 'hetzner1_20171201-062001.tar.gpg' failed again last night!
| |
| <pre>
| |
| + tar -cvf /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar hetzner1/20171201-062001/
| |
| hetzner1/20171201-062001/
| |
| hetzner1/20171201-062001/public_html/
| |
| hetzner1/20171201-062001/public_html/public_html.20171201-062001.tar.bz2
| |
| + gpg --symmetric --cipher-algo aes --passphrase-file /home/marcin_ose/backups/ose-backups-cron.key /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar
| |
| Reading passphrase from file descriptor 3
| |
| + rm /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar.gpg
| |
| Traceback (most recent call last):
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 736, in <module>
| |
| main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 732, in main
| |
| App().main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 718, in main
| |
| self.args.func()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 500, in archive_upload
| |
| file_obj=self.args.file, description=name)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/vault.py", line 178, in create_archive_from_file
| |
| writer.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 219, in write
| |
| self.partitioner.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 61, in write
| |
| self._send_part()
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 75, in _send_part
| |
| self.send_fn(part)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 222, in _upload_part
| |
| self.uploader.upload_part(self.next_part_index, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 129, in upload_part
| |
| content_range, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 637, in upload_part
| |
| response_headers=response_headers)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 84, in make_request
| |
| raise UnexpectedHTTPResponseError(ok_responses, response)
| |
| boto.glacier.exceptions.UnexpectedHTTPResponseError: Expected 204, got (408, code=RequestTimeoutException, message=Request timed out.)
| |
| + 1 -eq 0
| |
| marcin_ose@hancock:~/backups$
| |
| </pre>
| |
| # the good news is that my new if statement prevented it from being deleted
| |
| <pre>
| |
| hancock% ls -lah uploadToGlacier
| |
| total 12G
| |
| drwxr-xr-x 2 marcin_ose pg1589252 4.0K Apr 3 16:37 .
| |
| drwxr-xr-x 5 marcin_ose pg1589252 4.0K Apr 3 15:37 ..
| |
| -rw-r--r-- 1 marcin_ose pg1589252 2.3M Apr 3 16:14 hetzner1_20171201-062001.fileList.txt.bz2
| |
| -rw-r--r-- 1 marcin_ose pg1589252 2.3M Apr 3 16:15 hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| -rw-r--r-- 1 marcin_ose pg1589252 12G Apr 3 16:37 hetzner1_20171201-062001.tar.gpg
| |
| hancock%
| |
| </pre>
| |
| # I found a lot of people complaining about this timeout issue. One post mentioned that we could increase the boto num_retries from the default (None) to some small number, but someone responded saying it didn't help them.. https://github.com/uskudnik/amazon-glacier-cmd-interface/issues/171
| |
| ## err, I checked the documentation, and it shows the default is actually '5', not '0' http://docs.pythonboto.org/en/latest/boto_config_tut.html#boto
| |
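| ## if I do decide to bump it, boto reads this from its config file; something like the following should work (a sketch, untested here):
| |
| <pre>
| |
| # append a [Boto] section raising the per-request retry count
| |
| cat >> ~/.boto <<'EOF'
| |
| [Boto]
| |
| num_retries = 10
| |
| EOF
| |
| </pre>
| |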
| # I am concerned that these 408 errors & all our retries will result in higher bills, if only because of the higher request count: say, an upload that would normally require 1,000 requests now requires 1,500 requests because of the retries.
| |
| # I checked the aws console, and our bill for March is $0.41.
| |
| ## Not bad considering we did a restore of 12G of data. It lists a $0.01 fee for restoring 1.187G of data. Indeed, I restored ~12G of data, but it looks like aws free tier permits 10G per month of data retrievals. So we got that restore test for essentially free; awesome.
| |
| ## our largest spend is 7,797 requests at $0.039. I'm afraid that increasing the chunk size to decrease the # of requests may increase the timeout risk, so I think I'll just stick with the default & cross my fingers
| |
| ## and $0.01 for 2.456 GB-Mo storage fee
| |
| # I did a sync & listed the archives on hetzner2, and now it lists all the files I'd expect, except the one that keeps failing with timeout issues
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync deleteMeIn2020
| |
| [root@hetzner2 glacier-cli]# ./glacier.py archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| hetzner1_20171001-052001.tar.gpg
| |
| hetzner1_20171101-062001.fileList.txt.bz2.gpg
| |
| hetzner1_20171101-062001.tar.gpg
| |
| hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # except for the timeout issues, this process appears to work; we have only 5 days to delete all this data, so I'll kick off the upload for all remaining archives using the script (its per-directory loop is sketched after the list below)
| |
| <pre>
| |
| backupDirs="hetzner2/20170702-052001 hetzner2/20170801-072001 hetzner2/20170901-072001 hetzner2/20171001-072001 hetzner2/20171101-072001 hetzner2/20171202-072001 hetzner2/20180102-072001 hetzner2/20180202-072001 hetzner2/20180302-072001 hetzner2/20180401-072001 hetzner1/20170701-052001 hetzner1/20170801-052001 hetzner1/20180101-062001 hetzner1/20180201-062001 hetzner1/20180301-062002 hetzner1/20180401-052001"
| |
| </pre>
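| |
| # for reference, the per-directory loop that uploadToGlacier.sh runs looks roughly like this (a sketch reconstructed from the xtrace output in the entries below; not the verbatim script, and '$glacier' is just shorthand for the full python + glacier.py invocation):
| |
| <pre>
| |
| glacier="/home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2"
| |
| for dir in $backupDirs; do
| |
|    archiveName="$(echo "$dir" | tr / _)"
| |
|    fileListFilePath="$syncDir/$archiveName.fileList.txt"
| |
|    archiveFilePath="$syncDir/$archiveName.tar"
| |
|    # build the metadata file list: 'ls -lahR' plus a 'tar -tvf' listing of each tarball
| |
|    ls -lahR "$dir" > "$fileListFilePath"
| |
|    find "$dir" -type f -exec tar -tvf '{}' \; >> "$fileListFilePath"
| |
|    # compress & encrypt the file list, then upload it
| |
|    # (bzip2 replaces the .txt with a .bz2, which is why the trace's 'rm' of the .txt fails harmlessly)
| |
|    bzip2 "$fileListFilePath"
| |
|    gpg --symmetric --cipher-algo aes --passphrase-file "$encryptionKeyFilePath" "$fileListFilePath.bz2"
| |
|    $glacier archive upload deleteMeIn2020 "$fileListFilePath.bz2.gpg"
| |
|    # tar & encrypt the backup itself, then upload it
| |
|    tar -cvf "$archiveFilePath" "$dir/"
| |
|    gpg --symmetric --cipher-algo aes --passphrase-file "$encryptionKeyFilePath" "$archiveFilePath"
| |
|    rm "$archiveFilePath"
| |
|    $glacier archive upload deleteMeIn2020 "$archiveFilePath.gpg"
| |
| done
| |
| </pre>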
| |
| # and the manual re-upload attempt of 'hetzner1_20171201-062001.tar.gpg' (the one that failed with a timeout) failed again with a timeout; I'll hold off on the re-upload until the other script finishes; hopefully it will just be a small subset of archives that I need to retry
| |
| <pre>
| |
| hancock% /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar.gpg
| |
| | |
| Traceback (most recent call last):
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 736, in <module>
| |
| main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 732, in main
| |
| App().main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 718, in main
| |
| self.args.func()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 500, in archive_upload
| |
| file_obj=self.args.file, description=name)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/vault.py", line 178, in create_archive_from_file
| |
| writer.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 219, in write
| |
| self.partitioner.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 61, in write
| |
| self._send_part()
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 75, in _send_part
| |
| self.send_fn(part)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 222, in _upload_part
| |
| self.uploader.upload_part(self.next_part_index, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 129, in upload_part
| |
| content_range, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 637, in upload_part
| |
| response_headers=response_headers)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 84, in make_request
| |
| raise UnexpectedHTTPResponseError(ok_responses, response)
| |
| boto.glacier.exceptions.UnexpectedHTTPResponseError: Expected 204, got (408, code=RequestTimeoutException, message=Request timed out.)
| |
| hancock%
| |
| </pre>
| |
| # I just discovered that there are 2x versions of glacier-cli; one was forked from the other https://github.com/basak/glacier-cli/pull/24
| |
| ## https://github.com/pkaleta/glacier-cli
| |
| ## https://github.com/basak/glacier-cli
| |
| # determined that our dreamhost glacier-cli sandbox was checked out from basak's repo, which was last updated on Feb 6; this is the original repo.
| |
| <pre>
| |
| hancock% pwd
| |
| /home/marcin_ose/sandbox/glacier-cli
| |
| hancock% git remote show origin
| |
| * remote origin
| |
| Fetch URL: git://github.com/basak/glacier-cli.git
| |
| Push URL: git://github.com/basak/glacier-cli.git
| |
| HEAD branch: master
| |
| Remote branches:
| |
| annex-hook-script tracked
| |
| master tracked
| |
| Local branch configured for 'git pull':
| |
| master merges with remote master
| |
| Local ref configured for 'git push':
| |
| master pushes to master (up to date)
| |
| hancock%
| |
| </pre>
| |
| # determined that the hetzner2 repo uses the same origin
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# pwd
| |
| /root/backups/glacierRestore/glacier-cli
| |
| [root@hetzner2 glacier-cli]# git remote show origin
| |
| * remote origin
| |
| Fetch URL: git://github.com/basak/glacier-cli.git
| |
| Push URL: git://github.com/basak/glacier-cli.git
| |
| HEAD branch: master
| |
| Remote branches:
| |
| annex-hook-script tracked
| |
| master tracked
| |
| Local branch configured for 'git pull':
| |
| master merges with remote master
| |
| Local ref configured for 'git push':
| |
| master pushes to master (up to date)
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # it's important to note that the 2x repos document different commands for upload.
| |
| ## the 'basak' repo is more basic
| |
| <pre>
| |
| glacier vault list
| |
| glacier vault create vault-name
| |
| glacier vault sync [--wait] [--fix] [--max-age hours] vault-name
| |
| glacier archive list vault-name
| |
| glacier archive upload [--name archive-name] vault-name filename
| |
| glacier archive retrieve [--wait] [-o filename] [--multipart-size bytes] vault-name archive-name
| |
| glacier archive retrieve [--wait] [--multipart-size bytes] vault-name archive-name [archive-name...]
| |
| glacier archive delete vault-name archive-name
| |
| glacier job list
| |
| </pre>
| |
| ## but the 'pkaleta' repo has options for the part size & thread count for the upload
| |
| <pre>
| |
| glacier vault list
| |
| glacier vault create vault-name
| |
| glacier vault sync [--wait] [--fix] [--max-age hours] vault-name
| |
| glacier archive list vault-name
| |
| glacier archive upload [--encrypt] [--concurrent [--part-size size] [--num-threads count]] [--name archive-name] vault-name filename
| |
| glacier archive retrieve [--wait] [--decrypt] [-o filename] [--part-size bytes] vault-name archive-name [archive-name...]
| |
| glacier archive delete vault-name archive-name
| |
| glacier job list
| |
| </pre>
| |
| # so if I have horrible timeout issues from the script as-is, I'll try manual re-uploads using the 'pkaleta' repo to see if I have better results
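| |
| ## e.g. something like this (a hypothetical invocation based on the usage text above; I'd have to check that fork's README for the exact --part-size units):
| |
| <pre>
| |
| ./glacier.py --region us-west-2 archive upload --concurrent --part-size 8388608 --num-threads 4 deleteMeIn2020 hetzner1_20171201-062001.tar.gpg
| |
| </pre>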
| |
| | |
| =Tue Apr 03, 2018=
| |
| # Marcin pointed out some 'disk full' sql issues on our wordpress site (why is our site leaking error logs anyway?!?). I accidentally filled the disk when testing the glacier restore (12G downloaded + a 12G copy of the backup file + 12G decrypted + >12G uncompressed .. it adds up!). Because we're using varnish, it didn't take the whole site down, just the cache misses. I deleted my glacier restore files & banned the whole varnish cache for all sites to fix it.
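| |
| ## for the record, "banned the whole varnish cache" just means a blanket ban via varnishadm; something like this (from memory, so the exact ban expression may have differed):
| |
| <pre>
| |
| varnishadm "ban req.url ~ ."
| |
| </pre>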
| |
| # checked the inventory, but the backups I uploaded last night were not listed yet
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]# ./glacier.py -h
| |
| </pre>
| |
| # kicked off a sync; I'll check on this tomorrow
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync deleteMeIn2020
| |
| glacier: queued inventory job for u'deleteMeIn2020'
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # the upload script output shows a traceback dump due to an exception: boto.glacier.exceptions.UnexpectedHTTPResponseError: Expected 204, got (408, code=RequestTimeoutException, message=Request timed out.)
| |
| <pre>
| |
| marcin_ose@hancock:~/backups$ ./uploadToGlacier.sh
| |
| + backupDirs='hetzner1/20171101-062001 hetzner1/20171201-062001'
| |
| + syncDir=/home/marcin_ose/backups/uploadToGlacier
| |
| + encryptionKeyFilePath=/home/marcin_ose/backups/ose-backups-cron.key
| |
| + export AWS_ACCESS_KEY_ID=CHANGEME
| |
| + AWS_ACCESS_KEY_ID=CHANGEME
| |
| + export AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| + AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| ++ echo hetzner1/20171101-062001 hetzner1/20171201-062001
| |
| + for dir in '$(echo $backupDirs)'
| |
| ++ echo hetzner1/20171101-062001
| |
| ++ tr / _
| |
| + archiveName=hetzner1_20171101-062001
| |
| ++ date -u --rfc-3339=seconds
| |
| + timestamp='2018-04-02 18:15:42+00:00'
| |
| + fileListFilePath=/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171101-062001.fileList.txt
| |
| + archiveFilePath=/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171101-062001.tar
| |
| + echo ================================================================================
| |
| + echo 'This file is metadata for the archive '\hetzner1_20171101-062001'\. In it, we list all the files included in the compressed/encrypted archive (produced using '\ls -lahR hetzner1/20171101-062001'\), including the files within the tarballs within the archive (produced using '\find hetzner1/20171101-062001 -type f -exec tar -tvf '\{}'\ \; '\)'
| |
| + echo ''
| |
| + echo ' - Michael Altfield <maltfield@opensourceecology.org>'
| |
| + echo ''
| |
| + echo ' Note: this file was generated at 2018-04-02 18:15:42+00:00'
| |
| + echo ================================================================================
| |
| + echo '#############################'
| |
| + echo '# '\ls -lahR'\ output follows #'
| |
| + echo '#############################'
| |
| + ls -lahR hetzner1/20171101-062001
| |
| + echo ================================================================================
| |
| | |
| + echo '############################'
| |
| + echo '# tarball contents follows #'
| |
| + echo '############################'
| |
| + find hetzner1/20171201-062001 -type f -exec tar -tvf '{}' ';'
| |
| + echo ================================================================================
| |
| + bzip2 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt
| |
| + gpg --symmetric --cipher-algo aes --passphrase-file /home/marcin_ose/backups/ose-backups-cron.key /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2
| |
| Reading passphrase from file descriptor 3
| |
| + rm /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt
| |
| rm: cannot remove ‘/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt’: No such file or directory
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2.gpg
| |
| + tar -cvf /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar hetzner1/20171201-062001/
| |
| hetzner1/20171201-062001/
| |
| hetzner1/20171201-062001/public_html/
| |
| hetzner1/20171201-062001/public_html/public_html.20171201-062001.tar.bz2
| |
| + gpg --symmetric --cipher-algo aes --passphrase-file /home/marcin_ose/backups/ose-backups-cron.key /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar
| |
| Reading passphrase from file descriptor 3
| |
| + rm /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar.gpg
| |
| Traceback (most recent call last):
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 736, in <module>
| |
| main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 732, in main
| |
| App().main()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 718, in main
| |
| self.args.func()
| |
| File "/home/marcin_ose/sandbox/glacier-cli/glacier.py", line 500, in archive_upload
| |
| file_obj=self.args.file, description=name)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/vault.py", line 178, in create_archive_from_file
| |
| writer.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 219, in write
| |
| self.partitioner.write(data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 61, in write
| |
| self._send_part()
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 75, in _send_part
| |
| self.send_fn(part)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 222, in _upload_part
| |
| self.uploader.upload_part(self.next_part_index, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/writer.py", line 129, in upload_part
| |
| content_range, part_data)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 637, in upload_part
| |
| response_headers=response_headers)
| |
| File "/home/marcin_ose/.local/lib/aws/local/lib/python2.7/site-packages/boto/glacier/layer1.py", line 84, in make_request
| |
| raise UnexpectedHTTPResponseError(ok_responses, response)
| |
| boto.glacier.exceptions.UnexpectedHTTPResponseError: Expected 204, got (408, code=RequestTimeoutException, message=Request timed out.)
| |
| + rm -rf /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.fileList.txt.bz2.gpg /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171201-062001.tar.gpg
| |
| marcin_ose@hancock:~/backups$
| |
| </pre>
| |
| # so it looks like all were successful except the last actual data archive for 'hetzner1_20171201-062001.tar.gpg'
| |
| ## that's pretty awful if we were to loop it. I could add some retry logic, but the dollar cost of accidentally uploading an archive twice is pretty high :\ for now I'll just re-run the script for this one archive
| |
| ## I also updated the script to only delete a file if $? was 0 after the upload attempt. Then, if an upload fails, I can manually trigger it again after reviewing the script output and the contents of the directory.
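| |
| ## the guard is simple; roughly this (a sketch using the script's variable names & the cleanup list from the trace above; not the verbatim change):
| |
| <pre>
| |
| /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 "$archiveFilePath.gpg"
| |
| if [ $? -eq 0 ]; then
| |
|    rm -rf "$fileListFilePath.bz2" "$fileListFilePath.bz2.gpg" "$archiveFilePath.gpg"
| |
| else
| |
|    echo "upload failed; leaving $archiveFilePath.gpg in place for a manual retry"
| |
| fi
| |
| </pre>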
| |
| | |
| =Mon Apr 02, 2018=
| |
| # dreamhost got back to me on the extension. It wasn't Elizabeth, but Jin. Jin said we could have until Apr 9th, so 3 more days than the original deadline; hopefully I can get all the data onto glacier in the 7 days that gives me.
| |
| # the upload of the 'hetzner1_20171001-052001' backup data that I kicked off last night completed
| |
| <pre>
| |
| marcin_ose@hancock:~/backups$ ./uploadToGlacier.sh
| |
| + backupDirs=hetzner1/20171001-052001
| |
| + syncDir=/home/marcin_ose/backups/uploadToGlacier
| |
| + encryptionKeyFilePath=/home/marcin_ose/backups/ose-backups-cron.key
| |
| + export AWS_ACCESS_KEY_ID=CHANGEME
| |
| + AWS_ACCESS_KEY_ID=CHANGEME
| |
| + export AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| + AWS_SECRET_ACCESS_KEY=CHANGEME
| |
| ++ echo hetzner1/20171001-052001
| |
| + for dir in '$(echo $backupDirs)'
| |
| ++ echo hetzner1/20171001-052001
| |
| ++ tr / _
| |
| + archiveName=hetzner1_20171001-052001
| |
| ++ date -u --rfc-3339=seconds
| |
| + timestamp='2018-04-01 19:44:13+00:00'
| |
| + fileListFilePath=/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt
| |
| + archiveFilePath=/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.tar
| |
| + echo ================================================================================
| |
| + echo 'This file is metadata for the archive '\hetzner1_20171001-052001'\. In it, we list all the files included in the compressed/encrypted archive (produced using '\ls -lahR hetzner1/20171001-052001'\), including the files within the tarballs within the archive (produced using '\find hetzner1/20171001-052001 -type f -exec tar -tvf '\{}'\ \; '\)'
| |
| + echo ''
| |
| + echo ' - Michael Altfield <maltfield@opensourceecology.org>'
| |
| + echo ''
| |
| + echo ' Note: this file was generated at 2018-04-01 19:44:13+00:00'
| |
| + echo ================================================================================
| |
| + echo '#############################'
| |
| + echo '# '\ls -lahR'\ output follows #'
| |
| + echo '#############################'
| |
| + ls -lahR hetzner1/20171001-052001
| |
| + echo ================================================================================
| |
| + echo '############################'
| |
| + echo '# tarball contents follows #'
| |
| + echo '############################'
| |
| + find hetzner1/20171001-052001 -type f -exec tar -tvf '{}' ';'
| |
| + echo ================================================================================
| |
| + bzip2 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt
| |
| + gpg --symmetric --cipher-algo aes --passphrase-file /home/marcin_ose/backups/ose-backups-cron.key /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt.bz2
| |
| Reading passphrase from file descriptor 3
| |
| + rm /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt
| |
| rm: cannot remove ‘/home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt’: No such file or directory
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.fileList.txt.bz2.gpg
| |
| + tar -cvf /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.tar hetzner1/20171001-052001/
| |
| hetzner1/20171001-052001/
| |
| hetzner1/20171001-052001/public_html/
| |
| hetzner1/20171001-052001/public_html/public_html.20171001-052001.tar.bz2
| |
| + gpg --symmetric --cipher-algo aes --passphrase-file /home/marcin_ose/backups/ose-backups-cron.key /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.tar
| |
| Reading passphrase from file descriptor 3
| |
| + rm /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.tar
| |
| + /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/sandbox/glacier-cli/glacier.py --region us-west-2 archive upload deleteMeIn2020 /home/marcin_ose/backups/uploadToGlacier/hetzner1_20171001-052001.tar.gpg
| |
| marcin_ose@hancock:~/backups$
| |
| </pre>
| |
| # the inventory still doesn't show the above backups, so I issued a sync
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync deleteMeIn2020
| |
| glacier: queued inventory job for u'deleteMeIn2020'
| |
| [root@hetzner2 glacier-cli]#
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # the restore of the 'hetzner1_20170901-052001' backup data from Glacier that I kicked off yesterday appears to have finished
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 'hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name' 'hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates'
| |
| [root@hetzner2 glacier-cli]# timed out waiting for input: auto-logout
| |
| [maltfield@hetzner2 ~]$
| |
| </pre>
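| |
| # side note: the auto-logout above hit the idle shell after the retrieve had already returned, but for these multi-hour '--wait' commands it'd be safer to run inside screen so a dropped session can't kill the job output or download mid-transfer, e.g.:
| |
| <pre>
| |
| screen -S glacier
| |
| ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 '<archive name>'
| |
| # detach with C-a d; reattach later with 'screen -r glacier'
| |
| </pre>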
| |
| # yep, the files got dropped into the cwd with the archive's description as the filename. This should be more sane going forward now that I've updated the uploadToGlacier.sh script to *not* specify a '--name', which means that glacier-cli will set the archive description to the filename
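| |
| # concretely, the change is just dropping the '--name' argument (a sketch; the description string is abbreviated):
| |
| <pre>
| |
| # old: the description was set via a long human-readable --name, which then became the download filename
| |
| ./glacier.py --region us-west-2 archive upload --name 'hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of ...' deleteMeIn2020 hetzner1_20170901-052001.tar.gpg
| |
| # new: no --name, so glacier-cli uses the filename itself as the description
| |
| ./glacier.py --region us-west-2 archive upload deleteMeIn2020 hetzner1_20170901-052001.tar.gpg
| |
| </pre>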
| |
| # before I do anything, I'm going to make copies of these files in case my gpg/tar manipulations delete the originals; I don't want to pay to download them from Glacier again!
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# mkdir ../orig
| |
| [root@hetzner2 glacier-cli]# cp hetzner1_20170901-052001.fileList.txt.bz2.gpg\:\ this\ is\ a\ metadata\ file\ showing\ the\ file\ and\ dir\ list\ contents\ of\ the\ archive\ of\ the\ same\ prefix\ name ../orig/
| |
| [root@hetzner2 glacier-cli]# cp hetzner1_20170901-052001.tar.gpg\:\ this\ is\ an\ encrypted\ tarball\ of\ a\ backup\ from\ our\ ose\ server\ taken\ at\ the\ time\ that\ the\ archive\ description\ prefix\ indicates ../orig
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # I renamed the other files to just the filename, stripping the description portion that starts at the colon (:), and decrypted them
| |
| <pre>
| |
| [root@hetzner2 glacierRestore]# gpg --batch --passphrase-file /root/backups/ose-backups-cron.key --output hetzner1_20170901-052001.fileList.txt.bz2 --decrypt hetzner1_20170901-052001.fileList.txt.bz2.gpg
| |
| gpg: AES encrypted data
| |
| gpg: encrypted with 1 passphrase
| |
| [root@hetzner2 glacierRestore]# ls
| |
| glacier-cli hetzner1_20170901-052001.fileList.txt.bz2.gpg orig
| |
| hetzner1_20170901-052001.fileList.txt.bz2 hetzner1_20170901-052001.hetzner1_20170901-052001.tar.gpg
| |
| [root@hetzner2 glacierRestore]#
| |
| [root@hetzner2 glacierRestore]# ls
| |
| glacier-cli hetzner1_20170901-052001.fileList.txt.bz2.gpg hetzner1_20170901-052001.tar.gpg
| |
| hetzner1_20170901-052001.fileList.txt.bz2 hetzner1_20170901-052001.tar orig
| |
| [root@hetzner2 glacierRestore]#
| |
| </pre>
| |
| # extracted the decrypted archive that was downloaded from glacier
| |
| <pre>
| |
| [root@hetzner2 glacierRestore]# tar -xf hetzner1_20170901-052001.tar
| |
| [root@hetzner2 glacierRestore]#
| |
| [root@hetzner2 glacierRestore]# ls
| |
| glacier-cli hetzner1_20170901-052001.fileList.txt.bz2 hetzner1_20170901-052001.tar orig
| |
| hetzner1 hetzner1_20170901-052001.fileList.txt.bz2.gpg hetzner1_20170901-052001.tar.gpg
| |
| [root@hetzner2 glacierRestore]#
| |
| </pre>
| |
| # extracted the compressed public_html contents from within the (uncompressed) wrapper tarball
| |
| <pre>
| |
| [root@hetzner2 public_html]# tar -xjf public_html.20170901-052001.tar.bz2
| |
| [root@hetzner2 public_html]#
| |
| [root@hetzner2 public_html]# du -sh *
| |
| 12G public_html.20170901-052001.tar.bz2
| |
| 20G usr
| |
| [root@hetzner2 public_html]#
| |
| </pre>
| |
| # confirmed that I could read the contents of one of the files after the archive was downloaded from glacier, decrypted, and extracted; this completes our end-to-end test of a restore from glacier (the full pipeline is recapped after the output below)
| |
| <pre>
| |
| [root@hetzner2 public_html]# head usr/www/users/osemain/w/README
| |
| | |
| ##### MediaWiki
| |
| | |
| MediaWiki is a popular and free, open-source wiki software package written in
| |
| PHP. It serves as the platform for Wikipedia and the other projects of the Wikimedia
| |
| Foundation, which deliver content in over 280 languages to more than half a billion
| |
| people each month. MediaWiki's reliability and robust feature set have earned it a
| |
| large and vibrant community of third-party users and developers.
| |
| | |
| MediaWiki is:
| |
| | |
| [root@hetzner2 public_html]#
| |
| </pre>
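| |
| # for future reference, the whole restore boils down to these steps (just recapping the commands from this test in one place; with the new filename-as-description naming, the plain filename should work as the archive name):
| |
| <pre>
| |
| # 1. retrieve from glacier (the retrieval job itself takes ~4 hours before the download starts)
| |
| ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 'hetzner1_20170901-052001.tar.gpg'
| |
| # 2. decrypt the downloaded archive
| |
| gpg --batch --passphrase-file /root/backups/ose-backups-cron.key --output hetzner1_20170901-052001.tar --decrypt hetzner1_20170901-052001.tar.gpg
| |
| # 3. extract the wrapper tarball
| |
| tar -xf hetzner1_20170901-052001.tar
| |
| # 4. extract the inner compressed tarball
| |
| cd hetzner1/20170901-052001/public_html && tar -xjf public_html.20170901-052001.tar.bz2
| |
| </pre>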
| |
| | |
| =Sun Apr 01, 2018=
| |
| # the reinventory of vault archive metadata finished last night
| |
| <pre>
| |
| root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync --max-age=0 --wait deleteMeIn2020
| |
| [root@hetzner2 glacier-cli]# timed out waiting for input: auto-logout
| |
| [maltfield@hetzner2 ~]$
| |
| </pre>
| |
| # but the contents are still stale! My best guess is that there is some delay after an archive is uploaded before it's even available to be listed in an inventory (iirc the aws docs say vault inventories are only refreshed about once per day)
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # I logged into the aws console, which shows the deleteMeIn2020 vault as having 7 archives @ a total size of 13.1 GB. This inventory was last updated at "Apr 1, 2018 5:41:24 AM". The help alt text says the timezone is my system's time, so that's much later than when I initiated the inventory last night.
| |
| # I also checked the aws console billing while I was in. The month just cut over (it's April 1st--but no jokes here), so I can see the entire bill for March came to a total of $0.15.
| |
| ## unfortunately, the majority of the fee was in requests: 2,872 requests for a total of $0.14. The storage fee itself was just $0.01. Therefore, I should probably look into how to configure boto to increase the chunk size for 'glacier-cli' uploads. Even so, the total would be ~20x that for all our backup dumps to glacier = 0.14*20 = $2.80. Ok, that's cheap enough. Even $5 should be fine.
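| |
| ## back-of-the-envelope: if boto's default Glacier part size is 4MB (I believe it is), a 12G archive uploads in about 12*1024/4 = 3,072 parts, i.e. ~3,000 requests per large archive, which lines up with the 2,872 requests billed. Doubling the part size should roughly halve the request count, at the cost of larger (and possibly more timeout-prone) individual part uploads.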
| |
| # I tried a refresh again; same result
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # playing around with the glacier-cli command, I came across this job listing
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 job list
| |
| i/d 2018-04-01T03:07:35.470Z deleteMeIn2020
| |
| i/d 2018-03-31T19:47:45.511Z deleteMeIn2020
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # I kicked off another max-age=0 sync while I poke around (might as well--it takes 4 hours!)
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync --max-age=0 --wait deleteMeIn2020
| |
| | |
| </pre>
| |
| # then I checked the job list again, and it grew!
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 job list
| |
| i/p 2018-04-01T13:43:13.901Z deleteMeIn2020
| |
| i/d 2018-04-01T03:07:35.470Z deleteMeIn2020
| |
| i/d 2018-03-31T19:47:45.511Z deleteMeIn2020
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| ## the top one is the one I just initiated. My best guess is that the 'i' means "Inventory", the 'p' means "inProgress" and the 'd' means "Done"
| |
| # hopped over to the aws-cli command to investigate this further
| |
| <pre>
| |
| hancock% /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/bin/aws glacier list-jobs --account-id - --vault-name deleteMeIn2020
| |
| {
| |
| "JobList": [
| |
| {
| |
| "InventoryRetrievalParameters": {
| |
| "Format": "JSON"
| |
| },
| |
| "VaultARN": "arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020",
| |
| "Completed": false,
| |
| "JobId": "7Q31uhwAGJ6TmzC-s4nif9ldkoiZYrMh6V1xHE9QoaMGvGcf0qSp7xo76LtrwsVDE5-CIeW1a3UzwnwCiOcZSngCGV1V",
| |
| "Action": "InventoryRetrieval",
| |
| "CreationDate": "2018-04-01T13:43:13.901Z",
| |
| "StatusCode": "InProgress"
| |
| },
| |
| {
| |
| "CompletionDate": "2018-04-01T03:07:35.470Z",
| |
| "VaultARN": "arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020",
| |
| "InventoryRetrievalParameters": {
| |
| "Format": "JSON"
| |
| },
| |
| "Completed": true,
| |
| "InventorySizeInBytes": 2232,
| |
| "JobId": "gHpKcv0KXVmfoMOa_TrqeVzLFAzZzpCdwsJl-9FeEbQHQFr6LEwzspwE6nZqrEi1HgmeDixjtWbw1JciInf5QxHc9dFe",
| |
| "Action": "InventoryRetrieval",
| |
| "CreationDate": "2018-03-31T23:21:19.873Z",
| |
| "StatusMessage": "Succeeded",
| |
| "StatusCode": "Succeeded"
| |
| },
| |
| {
| |
| "CompletionDate": "2018-03-31T19:47:45.511Z",
| |
| "VaultARN": "arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020",
| |
| "InventoryRetrievalParameters": {
| |
| "Format": "JSON"
| |
| },
| |
| "Completed": true,
| |
| "InventorySizeInBytes": 2232,
| |
| "JobId": "5COrR-wLYeA8ZTyhlBI50Pq4Egnx5G11OmTZ2lVwpuuJTgdvwbEeC1rY1dzST0fCPRm1-D_pvHH5wyg1fJpIhgHJ4ii0",
| |
| "Action": "InventoryRetrieval",
| |
| "CreationDate": "2018-03-31T16:01:15.869Z",
| |
| "StatusMessage": "Succeeded",
| |
| "StatusCode": "Succeeded"
| |
| }
| |
| ]
| |
| }
| |
| hancock%
| |
| </pre>
| |
| ## so that confirms that all 3x jobs are "InventoryRetrieval" jobs. 2x are "Succeeded" and 1x (the one I just initiated) is "InProgress". The finished ones say the size in bytes is '2232' in both cases. That's 2K, which is probably the size of the inventory itself (ie: the metadata report)--not the size of the vault's archives.
| |
| # got the output of the oldest inventory job
| |
| <pre>
| |
| hancock% /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/bin/aws glacier get-job-output --account-id - --vault-name deleteMeIn2020 --job-id 5COrR-wLYeA8ZTyhlBI50Pq4Egnx5G11OmTZ2lVwpuuJTgdvwbEeC1rY1dzST0fCPRm1-D_pvHH5wyg1fJpIhgHJ4ii0 output.json
| |
| {
| |
| "status": 200,
| |
| "acceptRanges": "bytes",
| |
| "contentType": "application/json"
| |
| }
| |
| hancock% cat output.json
| |
| {"VaultARN":"arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020","InventoryDate":"2018-03-31T15:25:52Z","ArchiveList":[{"ArchiveId":"qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:35:48Z","Size":380236,"SHA256TreeHash":"a1301459044fa4680af11d3e2d60b33a49de7e091491bd02d497bfd74945e40b"},{"ArchiveId":"lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:50:36Z","Size":280709,"SHA256TreeHash":"3f79016e6157ff3e1c9c853337b7a3e7359a9183ae9b26f1d03c1d1c594e45ab"},{"ArchiveId":"fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:53:00Z","Size":280718,"SHA256TreeHash":"6ba4c8a93163b2d3978ae2d87f26c5ad571330ecaa9da3b6161b95074558cef4"},{"ArchiveId":"zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:55:04Z","Size":1187682789,"SHA256TreeHash":"c90c696931ed1dc7cd587dc1820ddb0567a4835bd46db76c9a326215d9950c8f"},{"ArchiveId":"Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:57:50Z","Size":877058000,"SHA256TreeHash":"fdefdad19e585df8324ed25f2f52f7d98bcc368929f84dafa9a4462333af095b"}]}% hancock%
| |
| </pre>
| |
| # got the output of the next newest inventory, which I guess is the one I generated last night
| |
| <pre>
| |
| hancock% /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/bin/aws glacier get-job-output --account-id - --vault-name deleteMeIn2020 --job-id 'gHpKcv0KXVmfoMOa_TrqeVzLFAzZzpCdwsJl-9FeEbQHQFr6LEwzspwE6nZqrEi1HgmeDixjtWbw1JciInf5QxHc9dFe' output.json
| |
| {
| |
| "status": 200,
| |
| "acceptRanges": "bytes",
| |
| "contentType": "application/json"
| |
| }
| |
| hancock% cat output.json
| |
| {"VaultARN":"arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020","InventoryDate":"2018-03-31T15:25:52Z","ArchiveList":[{"ArchiveId":"qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:35:48Z","Size":380236,"SHA256TreeHash":"a1301459044fa4680af11d3e2d60b33a49de7e091491bd02d497bfd74945e40b"},{"ArchiveId":"lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:50:36Z","Size":280709,"SHA256TreeHash":"3f79016e6157ff3e1c9c853337b7a3e7359a9183ae9b26f1d03c1d1c594e45ab"},{"ArchiveId":"fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:53:00Z","Size":280718,"SHA256TreeHash":"6ba4c8a93163b2d3978ae2d87f26c5ad571330ecaa9da3b6161b95074558cef4"},{"ArchiveId":"zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:55:04Z","Size":1187682789,"SHA256TreeHash":"c90c696931ed1dc7cd587dc1820ddb0567a4835bd46db76c9a326215d9950c8f"},{"ArchiveId":"Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:57:50Z","Size":877058000,"SHA256TreeHash":"fdefdad19e585df8324ed25f2f52f7d98bcc368929f84dafa9a4462333af095b"}]}% hancock%
| |
| </pre>
| |
| # so both of the most recently completed inventory results show only 5 archives in the deleteMeIn2020 vault; that's stale. We want the metadata for the 'hetzner1/20170901-052001' backup so we can test to make sure that a >4G archive restoration works (4G presumably mattering because it's Glacier's single-request upload limit, above which multipart upload kicks in) before proceeding with dumping the rest of the backups into glacier.
| |
| # I sent an email to Elizabeth at Dreamhost telling her that we already reduced our usage by ~250G, and I asked for an additional 2 weeks so we could validate our Glacier POC before uploading the rest of the data ahead of their deletion deadline.
| |
| ...
| |
| # I biked 40km, ate lunch, and checked again; the inventory was complete & now listed the backups that I uploaded last night = 'hetzner1_20170901-052001'
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 vault sync --max-age=0 --wait deleteMeIn2020
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive list deleteMeIn2020
| |
| hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name
| |
| hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates
| |
| id:zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates
| |
| id:qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| id:fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw this is a metadata file showing the file and dir list contents of the archive of the same name
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # as nice as glacier-cli is for uploading, its list output tries to simplify things (treating archives as files), so it excludes the actual archive id. And it lacks the hash, size, & creation timestamp. To get these, I got the json job output using the aws-cli (a jq one-liner for slicing this json follows the output below)
| |
| <pre>
| |
| hancock% /home/marcin_ose/.local/lib/aws/bin/python /home/marcin_ose/bin/aws glacier get-job-output --account-id - --vault-name deleteMeIn2020 --job-id '7Q31uhwAGJ6TmzC-s4nif9ldkoiZYrMh6V1xHE9QoaMGvGcf0qSp7xo76LtrwsVDE5-CIeW1a3UzwnwCiOcZSngCGV1V' output.json
| |
| {
| |
| "status": 200,
| |
| "acceptRanges": "bytes",
| |
| "contentType": "application/json"
| |
| }
| |
| hancock% cat output.json
| |
| {"VaultARN":"arn:aws:glacier:us-west-2:099400651767:vaults/deleteMeIn2020","InventoryDate":"2018-04-01T09:41:24Z","ArchiveList":[{"ArchiveId":"qZJWJ57sBb9Nsz0lPGKruocLivg8SVJ40UiZznG308wSPAS0vXyoYIOJekP81YwlTmci-eWETvsy4Si2e5xYJR0oVUNLadwPVkbkPmEWI1t75fbJM_6ohrNjNkwlyWPLW-lgeOaynA","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:35:48Z","Size":380236,"SHA256TreeHash":"a1301459044fa4680af11d3e2d60b33a49de7e091491bd02d497bfd74945e40b"},{"ArchiveId":"lEJNkWsTF-zZ1fj_2XDVrgbTFGhthkMo0FsLyCb7EM18JrQ-SimUAhAi7HtkrTZMT-wuYSDupFGDVzh87cZlzxRXrex_9NHtTkQyp93A2gICb9zOLDViUr8gHJO6AcyN-R9j2yiIDw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:50:36Z","Size":280709,"SHA256TreeHash":"3f79016e6157ff3e1c9c853337b7a3e7359a9183ae9b26f1d03c1d1c594e45ab"},{"ArchiveId":"fOeCrDHiQUrbvZoyT-jkSQH_euCAhtRcy8wetvONgUWyJBYzxM7AMmbc4YJzRuroL57hVmIUDQRHS-deAo3WG0esgBU52W2qes-47L1-VkczCpYkeGQjlNFGXaKE7ZeZ6jgZ3hBnpw","ArchiveDescription":"this is a metadata file showing the file and dir list contents of the archive of the same name","CreationDate":"2018-03-31T02:53:00Z","Size":280718,"SHA256TreeHash":"6ba4c8a93163b2d3978ae2d87f26c5ad571330ecaa9da3b6161b95074558cef4"},{"ArchiveId":"zr-OjFat_oTJ4k_bMRdczuqDL_GNBpbgTVcHYSg6N-vTWvCe9FNgxJXrFeT26eL2LiXMEpijzaretHvFdyFYQarfZZzcFr0GEEB2O4rVEjtslkGuhbHfWMIGFbQZXQgmjE9aKl4EpA","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:55:04Z","Size":1187682789,"SHA256TreeHash":"c90c696931ed1dc7cd587dc1820ddb0567a4835bd46db76c9a326215d9950c8f"},{"ArchiveId":"Rb3XtAMEDXlx4KSEZ_-OdA121VJ4jHPEPHIGr33GUJ7wbixaxIzSa5gXV-2i_7-AH-_KUCuLMQbmMPxRN7an7xmMr3PHlzdZMXQj1YTFlJC0g2BT2_F1HJf8h6IocDcR-7EJQeFTqw","ArchiveDescription":"this is a compressed and encrypted tarball of a backup from our ose server taken at the time the archive name indicates","CreationDate":"2018-03-31T02:57:50Z","Size":877058000,"SHA256TreeHash":"fdefdad19e585df8324ed25f2f52f7d98bcc368929f84dafa9a4462333af095b"},{"ArchiveId":"P9wIGNBbLaAoz7xGht6Y4k7j33nGgPmg0RQ4sesN2tImQLjFN1dtkooVGrBnQqbPt8YhgvwUXv8eO_N72KRjS3RrZQYvkGxAQ9uPcJ-zaDOG8kII7l4p7UzGfaroO63ZreHItIW4GA","ArchiveDescription":"hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name","CreationDate":"2018-03-31T22:46:18Z","Size":2299038,"SHA256TreeHash":"2e789c8c99f08d338f8c1c2440afd76c23f76124c3dbdd33cbfa9f46f5c6b2aa"},{"ArchiveId":"o-naX0m4kQde-2i-8JZbEESi7r8OlFjIoDjgbQSXT_zt9L_e7qOH3HQ1R7ViQC3i7M0lVLbODsGZm9w9HfI3tHYKb2R1T_WWBwMxFuC_OhYiPX8uepTvvBg2Mg6KysP9H3zNzwGSZw","ArchiveDescription":"hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates","CreationDate":"2018-03-31T23:47:51Z","Size":12009829896,"SHA256TreeHash":"022f088abcfadefe7df5ac770f45f315ddee708f2470133ebd027ce988e1a45d"}]}% hancock%
| |
| </pre>
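| |
| # the raw inventory json is painful to read; something like jq can slice out just the ids & descriptions (a hypothetical one-liner, assuming jq is available on hancock):
| |
| <pre>
| |
| jq -r '.ArchiveList[] | .ArchiveId + "  " + .ArchiveDescription' output.json
| |
| </pre>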
| |
| # so I want to restore these 2x archives:
| |
| ## P9wIGNBbLaAoz7xGht6Y4k7j33nGgPmg0RQ4sesN2tImQLjFN1dtkooVGrBnQqbPt8YhgvwUXv8eO_N72KRjS3RrZQYvkGxAQ9uPcJ-zaDOG8kII7l4p7UzGfaroO63ZreHItIW4GA
| |
| ## o-naX0m4kQde-2i-8JZbEESi7r8OlFjIoDjgbQSXT_zt9L_e7qOH3HQ1R7ViQC3i7M0lVLbODsGZm9w9HfI3tHYKb2R1T_WWBwMxFuC_OhYiPX8uepTvvBg2Mg6KysP9H3zNzwGSZw
| |
| # unfortunately, it appears that the 'glacier-cli' tool also doesn't allow you to specify the id when restoring. I think I'm going to have to just make the archive description a filename at upload time to play nice with glacier-cli. In the meantime, I'll use my insane "name", which is actually a human-readable description of the archive (as amazon intended for this field)
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 'P9wIGNBbLaAoz7xGht6Y4k7j33nGgPmg0RQ4sesN2tImQLjFN1dtkooVGrBnQqbPt8YhgvwUXv8eO_N72KRjS3RrZQYvkGxAQ9uPcJ-zaDOG8kII7l4p7UzGfaroO63ZreHItIW4GA' 'o-naX0m4kQde-2i-8JZbEESi7r8OlFjIoDjgbQSXT_zt9L_e7qOH3HQ1R7ViQC3i7M0lVLbODsGZm9w9HfI3tHYKb2R1T_WWBwMxFuC_OhYiPX8uepTvvBg2Mg6KysP9H3zNzwGSZw'
| |
| glacier: archive 'P9wIGNBbLaAoz7xGht6Y4k7j33nGgPmg0RQ4sesN2tImQLjFN1dtkooVGrBnQqbPt8YhgvwUXv8eO_N72KRjS3RrZQYvkGxAQ9uPcJ-zaDOG8kII7l4p7UzGfaroO63ZreHItIW4GA' not found
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # but first I tried a substring; that failed too
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 'hetzner1_20170901-052001.fileList.txt.bz2.gpg' 'hetzner1_20170901-052001.tar.gpg'
| |
| glacier: archive 'hetzner1_20170901-052001.fileList.txt.bz2.gpg' not found
| |
| [root@hetzner2 glacier-cli]#
| |
| </pre>
| |
| # ok, using the whole description as the name
| |
| <pre>
| |
| [root@hetzner2 glacier-cli]# ./glacier.py --region us-west-2 archive retrieve --wait deleteMeIn2020 'hetzner1_20170901-052001.fileList.txt.bz2.gpg: this is a metadata file showing the file and dir list contents of the archive of the same prefix name' 'hetzner1_20170901-052001.tar.gpg: this is an encrypted tarball of a backup from our ose server taken at the time that the archive description prefix indicates'
| |
| </pre>
| |
| # that command appears to be waiting; I'll check on that tomorrow
| |
| # in the meantime, on the dreamhost server, I'll fix the script to just use the filename as the description & upload the next archive using this setting
| |
| # note that it's a new month, so dreamhost has our first-of-the-month backups for 2018-04-01. Therefore, here's the updated list I'll be sending to this deleteMeIn2020 vault (total 276G)
| |
| <pre>
| |
| 20G hetzner2/20170702-052001
| |
| 1.7G hetzner2/20170801-072001
| |
| 1.7G hetzner2/20170901-072001
| |
| 2.5G hetzner2/20171001-072001
| |
| 838M hetzner2/20171101-072001
| |
| 997M hetzner2/20171202-072001
| |
| 1.1G hetzner2/20180102-072001
| |
| 14G hetzner2/20180202-072001
| |
| 25G hetzner2/20180302-072001
| |
| 2.8G hetzner2/20180401-072001
| |
| | |
| 39G hetzner1/20170701-052001
| |
| 39G hetzner1/20170801-052001
| |
| 12G hetzner1/20170901-052001
| |
| 12G hetzner1/20171001-052001
| |
| 12G hetzner1/20171101-062001
| |
| 12G hetzner1/20171201-062001
| |
| 12G hetzner1/20180101-062001
| |
| 27G hetzner1/20180201-062001
| |
| 28G hetzner1/20180301-062002
| |
| 12G hetzner1/20180401-052001
| |
| </pre>
| |
| # the last backup I uploaded was 'hetzner1/20170901-052001', so for this test I'll use the following month's backup, which is the same size (12G, our smallest size that exceeds the 4G limit) = 'hetzner1/20171001-052001'
| |
| # I added a line to delete the contents of the 'uploadToGlacier/' directory after the upload to glacier
| |
| # I updated the uploadToGlacier.sh script to attempt to upload the next 2x backups by setting 'backupDirs="hetzner1/20171101-062001 hetzner1/20171201-062001"'
| |
| # I kicked off the uploadToGlacier.sh script. If the next sync in ~48 hours shows all 3x backups using the description as the file name (per glacier-cli's preference), then I think I can execute this script for the remaining backups.
| |